Commit Graph

487 Commits

Author SHA1 Message Date
Pieter Noordhuis ce5a3f2ce6
Upgrade TF provider to 1.48.0 (#1527)
## Changes

This includes a fix for library order not being respected.

## Tests

Manually confirmed the fix works in
https://github.com/databricks/bundle-examples/pull/29.
2024-06-26 09:29:46 +00:00
dependabot[bot] 8468878eed
Bump github.com/databricks/databricks-sdk-go from 0.42.0 to 0.43.0 (#1522)
Bumps
[github.com/databricks/databricks-sdk-go](https://github.com/databricks/databricks-sdk-go)
from 0.42.0 to 0.43.0.
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/databricks/databricks-sdk-go/releases">github.com/databricks/databricks-sdk-go's
releases</a>.</em></p>
<blockquote>
<h2>v0.43.0</h2>
<p>Major Changes and Improvements:</p>
<ul>
<li>Support partners in user agent for SDK (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/pull/925">#925</a>).</li>
<li>Add <code>serverless_compute_id</code> field to the config (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/pull/952">#952</a>).</li>
</ul>
<p>Other Changes:</p>
<ul>
<li>Generate from latest spec (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/pull/944">#944</a>)
and (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/pull/947">#947</a>).</li>
</ul>
<p>API Changes:</p>
<ul>
<li>Changed <code>IsolationMode</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#CatalogInfo">catalog.CatalogInfo</a>
to <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#CatalogIsolationMode">catalog.CatalogIsolationMode</a>.</li>
<li>Added <code>IsolationMode</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#ExternalLocationInfo">catalog.ExternalLocationInfo</a>.</li>
<li>Added <code>MaxResults</code> and <code>PageToken</code> fields for
<a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#ListCatalogsRequest">catalog.ListCatalogsRequest</a>.</li>
<li>Added <code>NextPageToken</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#ListCatalogsResponse">catalog.ListCatalogsResponse</a>.</li>
<li>Added <code>TableServingUrl</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#OnlineTable">catalog.OnlineTable</a>.</li>
<li>Added <code>IsolationMode</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#StorageCredentialInfo">catalog.StorageCredentialInfo</a>.</li>
<li>Changed <code>IsolationMode</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#UpdateCatalog">catalog.UpdateCatalog</a>
to <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#CatalogIsolationMode">catalog.CatalogIsolationMode</a>.</li>
<li>Added <code>IsolationMode</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#UpdateExternalLocation">catalog.UpdateExternalLocation</a>.</li>
<li>Added <code>IsolationMode</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#UpdateStorageCredential">catalog.UpdateStorageCredential</a>.</li>
<li>Added <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#CatalogIsolationMode">catalog.CatalogIsolationMode</a>.</li>
<li>Added <code>CreateSchedule</code>, <code>CreateSubscription</code>,
<code>DeleteSchedule</code>, <code>DeleteSubscription</code>,
<code>GetSchedule</code>, <code>GetSubscription</code>,
<code>List</code>, <code>ListSchedules</code>,
<code>ListSubscriptions</code> and <code>UpdateSchedule</code> methods
for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/dashboards#LakeviewAPI">w.Lakeview</a>
workspace-level service.</li>
<li>Added <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/dashboards#CreateScheduleRequest">dashboards.CreateScheduleRequest</a>,
<a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/dashboards#CreateSubscriptionRequest">dashboards.CreateSubscriptionRequest</a>,
<a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/dashboards#CronSchedule">dashboards.CronSchedule</a>,
<a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/dashboards#DashboardView">dashboards.DashboardView</a>,
<a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/dashboards#DeleteScheduleRequest">dashboards.DeleteScheduleRequest</a>,
<a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/dashboards#DeleteSubscriptionRequest">dashboards.DeleteSubscriptionRequest</a>,
<a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/dashboards#GetScheduleRequest">dashboards.GetScheduleRequest</a>,
<a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/dashboards#GetSubscriptionRequest">dashboards.GetSubscriptionRequest</a>,
<a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/dashboards#ListDashboardsRequest">dashboards.ListDashboardsRequest</a>,
<a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/dashboards#ListDashboardsResponse">dashboards.ListDashboardsResponse</a>,
<a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/dashboards#ListSchedulesRequest">dashboards.ListSchedulesRequest</a>,
<a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/dashboards#ListSchedulesResponse">dashboards.ListSchedulesResponse</a>,
<a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/dashboards#ListSubscriptionsRequest">dashboards.ListSubscriptionsRequest</a>,
<a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/dashboards#ListSubscriptionsResponse">dashboards.ListSubscriptionsResponse</a>,
<a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/dashboards#Schedule">dashboards.Schedule</a>,
<a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/dashboards#SchedulePauseStatus">dashboards.SchedulePauseStatus</a>,
<a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/dashboards#Subscriber">dashboards.Subscriber</a>,
<a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/dashboards#Subscription">dashboards.Subscription</a>,
<a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/dashboards#SubscriptionSubscriberDestination">dashboards.SubscriptionSubscriberDestination</a>,
<a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/dashboards#SubscriptionSubscriberUser">dashboards.SubscriptionSubscriberUser</a>
and <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/dashboards#UpdateScheduleRequest">dashboards.UpdateScheduleRequest</a>
structs.</li>
<li>Added <code>OnStreamingBacklogExceeded</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#JobEmailNotifications">jobs.JobEmailNotifications</a>.</li>
<li>Added <code>EnvironmentKey</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#RunTask">jobs.RunTask</a>.</li>
<li>Removed <code>ConditionTask</code>, <code>DbtTask</code>,
<code>NotebookTask</code>, <code>PipelineTask</code>,
<code>PythonWheelTask</code>, <code>RunJobTask</code>,
<code>SparkJarTask</code>, <code>SparkPythonTask</code>,
<code>SparkSubmitTask</code>, <code>SqlTask</code> and
<code>Environments</code> fields for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#SubmitRun">jobs.SubmitRun</a>.</li>
<li>Added <code>DbtTask</code> and <code>EnvironmentKey</code> fields for
<a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#SubmitTask">jobs.SubmitTask</a>.</li>
<li>Added <code>OnStreamingBacklogExceeded</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#TaskEmailNotifications">jobs.TaskEmailNotifications</a>.</li>
<li>Added <code>Periodic</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#TriggerSettings">jobs.TriggerSettings</a>.</li>
<li>Added <code>OnStreamingBacklogExceeded</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#WebhookNotifications">jobs.WebhookNotifications</a>.</li>
<li>Added <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#PeriodicTriggerConfiguration">jobs.PeriodicTriggerConfiguration</a>.</li>
<li>Added <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#PeriodicTriggerConfigurationTimeUnit">jobs.PeriodicTriggerConfigurationTimeUnit</a>.</li>
<li>Added <code>ProviderSummary</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/marketplace#Listing">marketplace.Listing</a>.</li>
<li>Added <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/marketplace#ProviderIconFile">marketplace.ProviderIconFile</a>.</li>
<li>Added <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/marketplace#ProviderIconType">marketplace.ProviderIconType</a>.</li>
<li>Added <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/marketplace#ProviderListingSummaryInfo">marketplace.ProviderListingSummaryInfo</a>.</li>
<li>Added <code>Start</code> method for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#AppsAPI">w.Apps</a>
workspace-level service.</li>
<li>Added <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#ServingEndpointsDataPlaneAPI">w.ServingEndpointsDataPlane</a>
workspace-level service.</li>
<li>Added <code>ServicePrincipalId</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#App">serving.App</a>.</li>
<li>Added <code>ServicePrincipalName</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#App">serving.App</a>.</li>
<li>Added <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#StartAppRequest">serving.StartAppRequest</a>.</li>
<li>Added <code>QueryNextPage</code> method for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/vectorsearch#VectorSearchIndexesAPI">w.VectorSearchIndexes</a>
workspace-level service.</li>
<li>Added <code>QueryType</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/vectorsearch#QueryVectorIndexRequest">vectorsearch.QueryVectorIndexRequest</a>.</li>
<li>Added <code>NextPageToken</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/vectorsearch#QueryVectorIndexResponse">vectorsearch.QueryVectorIndexResponse</a>.</li>
<li>Added <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/vectorsearch#QueryVectorIndexNextPageRequest">vectorsearch.QueryVectorIndexNextPageRequest</a>.</li>
</ul>
<p>OpenAPI SHA: 7437dabb9dadee402c1fc060df4c1ce8cc5369f0, Date:
2024-06-25</p>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="3e419132ea"><code>3e41913</code></a>
Release v0.43.0 (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/issues/955">#955</a>)</li>
<li><a
href="ce3dc984f7"><code>ce3dc98</code></a>
Add <code>serverless_compute_id</code> field to the config (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/issues/952">#952</a>)</li>
<li><a
href="00b1d09b24"><code>00b1d09</code></a>
Update OpenAPI spec (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/issues/947">#947</a>)</li>
<li><a
href="d098b1a3e7"><code>d098b1a</code></a>
Support partners in SDK (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/issues/925">#925</a>)</li>
<li><a
href="490bc13c0e"><code>490bc13</code></a>
Generate from latest spec (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/issues/944">#944</a>)</li>
<li>See full diff in <a
href="https://github.com/databricks/databricks-sdk-go/compare/v0.42.0...v0.43.0">compare
view</a></li>
</ul>
</details>
<br />

<details>
<summary>Most Recent Ignore Conditions Applied to This Pull
Request</summary>

| Dependency Name | Ignore Conditions |
| --- | --- |
| github.com/databricks/databricks-sdk-go | [>= 0.28.a, < 0.29] |
</details>


[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=github.com/databricks/databricks-sdk-go&package-manager=go_modules&previous-version=0.42.0&new-version=0.43.0)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)


---------

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Pieter Noordhuis <pieter.noordhuis@databricks.com>
2024-06-25 12:51:17 +00:00
Pieter Noordhuis 100a0516d4
Add context type and value to path rewriting (#1525)
## Changes

This change prepares for a future change in which the inner rewriting
functions need access to the underlying bundle.

Previously, all values were passed on the stack; adding yet another
value would have made the code less readable.

## Tests

Unit tests pass.
2024-06-25 10:04:22 +00:00
Gleb Kanterov 5ff06578ac
PythonMutator: replace stdin/stdout with files (#1512)
## Changes
Replace stdin/stdout with files in `PythonMutator`. Files are created in
a temporary directory.

Rename `ApplyPythonMutator` to `PythonMutator`.

Add test for `dyn.Location` behavior during the "load" stage.
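
The shape of the exchange, as a minimal sketch: `runPython` and the
`input.json`/`output.json` file names are assumptions for illustration,
not the mutator's actual contract.

```go
import (
	"context"
	"os"
	"path/filepath"
)

func exchangeViaFiles(ctx context.Context, configJSON []byte) ([]byte, error) {
	// Create a scratch directory for the round-trip with the subprocess.
	dir, err := os.MkdirTemp("", "python-mutator-")
	if err != nil {
		return nil, err
	}
	defer os.RemoveAll(dir)

	inputPath := filepath.Join(dir, "input.json")
	outputPath := filepath.Join(dir, "output.json")

	// Write the current bundle configuration for the subprocess to read.
	if err := os.WriteFile(inputPath, configJSON, 0o600); err != nil {
		return nil, err
	}

	// runPython is an assumed helper: it launches the Python subprocess,
	// which reads the input file and writes the mutated configuration
	// to the output file.
	if err := runPython(ctx, inputPath, outputPath); err != nil {
		return nil, err
	}
	return os.ReadFile(outputPath)
}
```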

## Tests
Unit tests
2024-06-24 07:47:41 +00:00
shreyas-goenka 068c7cfc2d
Return `dyn.InvalidValue` instead of `dyn.NilValue` when errors happen (#1514)
## Changes
With https://github.com/databricks/cli/pull/1507 and
https://github.com/databricks/cli/pull/1511 we are clarifying the
semantics associated with `dyn.InvalidValue` and `dyn.NilValue`. An
invalid value is the default zero value and is used to signal the
complete absence of a value.

A nil value, on the other hand, is a valid value for a piece of
configuration and signals explicitly setting a key to nil in the
configuration tree. In keeping with that theme, this PR returns
`dyn.InvalidValue` instead of `dyn.NilValue` at error sites. This change
is not expected to materially change behaviour and is being done to set
the right convention, since we have well-defined semantics associated
with both `NilValue` and `InvalidValue`.
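
A hedged sketch of the convention at an error site, assuming the
`dyn.Get` helper described in #1507 (illustrative, not an actual call
site from this PR):

```go
import "github.com/databricks/cli/libs/dyn"

// On failure, return the zero value dyn.InvalidValue, never dyn.NilValue:
// nil is reserved for keys explicitly set to null in the configuration.
func mustGet(v dyn.Value, path string) (dyn.Value, error) {
	out, err := dyn.Get(v, path)
	if err != nil {
		return dyn.InvalidValue, err
	}
	return out, nil
}
```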

## Tests
Unit tests and integration tests pass. Also manually scanned the changes
and the associated call sites to verify the `NilValue` value itself was
not being relied upon.
2024-06-21 14:22:42 +00:00
Pieter Noordhuis 446a9d0c52
Properly deal with nil values in `convert.FromTyped` (#1511)
## Changes

When a configuration defines:
```yaml
run_as:
```

It first showed up as `run_as -> nil` in the dynamic configuration only
to later be converted to `run_as -> {}` while going through typed
conversion. We were using the presence of a key to initialize an empty
value. This is incorrect and it should have remained a nil value.

This conversion was happening in `convert.FromTyped` where any struct
always returned a map value. Instead, it should only return a map value
in any one of these cases: 1) the struct has elements, 2) the struct was
originally a map in the dynamic configuration, or 3) the struct was
initialized to a non-empty pointer value.
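
A minimal sketch of that three-case rule; the helper and its arguments
are illustrative, not the actual `convert.FromTyped` signature:

```go
import "github.com/databricks/cli/libs/dyn"

// structValue returns a map value only in the three cases above;
// otherwise the struct stays nil (as a bare `run_as:` should).
func structValue(fields map[string]dyn.Value, wasMap, nonNilPtr bool) dyn.Value {
	if len(fields) > 0 || wasMap || nonNilPtr {
		return dyn.V(fields) // stand-in for constructing a map value
	}
	return dyn.NilValue
}
```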

Stacked on top of #1516 and #1518.

## Tests

* Unit tests pass.
* Integration tests pass.
* Manually ran through bundle CRUD with a bundle without resources.
2024-06-21 13:43:21 +00:00
Pieter Noordhuis 01adef666a
Set bool pointer to disable lock (#1516)
## Changes

This cherry-picks from #1490 to address an issue that came up in #1511.

The function `dyn.SetByPath` requires intermediate values to be present.
If they are not, it returns an error that it cannot index a map. This is
not an issue on main, where the intermediate maps are always created,
even if they are not present in the dynamic configuration tree. As of
#1511, we'll no longer populate empty maps for empty structs if they are
not explicitly set (i.e., a non-nil pointer). This change writes a bool
pointer to avoid this issue altogether.
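
A sketch of the workaround; the field path follows the PR description
and the bundle configuration layout, so treat the exact names as
assumptions:

```go
// Writing a non-nil *bool materializes the intermediate structs, so the
// dynamic tree contains bundle -> deployment -> lock and dyn.SetByPath
// can index into it.
enabled := false
b.Config.Bundle.Deployment.Lock.Enabled = &enabled
```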

## Tests

Unit tests pass.
2024-06-21 11:14:33 +00:00
Gleb Kanterov 57a5a65f87
Add ApplyPythonMutator (#1430)
## Changes
Add ApplyPythonMutator, which forks a Python subprocess and pipes the
bundle configuration through it.

It's enabled through the `experimental` section, for example:

```yaml
experimental:
  pydabs: 
    enable: true
    venv_path: .venv
```

For now, it's limited to two phases in the mutator pipeline:

- `load`: adds new jobs
- `init`: adds new jobs, or modifies existing ones

It's enforced that no jobs are modified in `load` and no jobs are
deleted in `load`/`init`, because otherwise it would break existing
assumptions.

## Tests
Unit tests
2024-06-20 08:43:08 +00:00
Pieter Noordhuis b2c03ea54c
Use `dyn.InvalidValue` to indicate absence (#1507)
## Changes

Previously, the functions `Get` and `Index` returned `dyn.NilValue` to
indicate that a map key or sequence index wasn't found. This is a valid
value, so we need to differentiate between actual absence and a real
`dyn.NilValue`. We do this with the zero value of a `dyn.Value` (also
captured in the constant `dyn.InvalidValue`).
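
A sketch of what callers can now express, assuming the lookup is exposed
as `dyn.Get` (names hedged per this PR's renames):

```go
import "github.com/databricks/cli/libs/dyn"

// describe distinguishes a missing key from one explicitly set to null.
func describe(root dyn.Value) string {
	v, err := dyn.Get(root, "run_as")
	switch {
	case err != nil:
		// Not found: v is dyn.InvalidValue, the zero value of dyn.Value.
		return "absent"
	case v.Kind() == dyn.KindNil:
		// Found, but explicitly set to null in the configuration.
		return "explicit null"
	default:
		return "set"
	}
}
```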

## Tests

* Unit tests.
* Renamed `Get` and `Index` to find and update all call sites.
2024-06-19 15:24:57 +00:00
Lennart Kats (databricks) deb3e365cd
Pause quality monitors when "mode: development" is used (#1481)
## Changes

Similar to scheduled jobs, quality monitors should be paused when in
development mode (in line with the [behavior for scheduled
jobs](https://docs.databricks.com/en/dev-tools/bundles/deployment-modes.html)).
@aravind-segu @arpitjasa-db please take a look and verify this behavior.

- [x] Followup: documentation changes. If we make this change we should
update
https://docs.databricks.com/dev-tools/bundles/deployment-modes.html.
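
For reference, development mode is enabled per target; with this change,
a quality monitor deployed through such a target is paused, just like a
job schedule would be:

```yaml
targets:
  dev:
    mode: development
```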

## Tests
Unit tests
2024-06-19 13:54:35 +00:00
Andrew Nester 663aa9ab8c
Override variables with lookup value even if values has default value set (#1504)
## Changes

This PR fixes the behaviour where variables were not overridden with the
lookup value from targets if they had a default value set in the default
target.

Fixes #1449 
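
A sketch of the previously broken setup (resource and variable names are
illustrative): the lookup override in a target now wins even though the
variable has a value in the default target:

```yaml
variables:
  cluster_id:
    description: Cluster to run on

targets:
  dev:
    default: true
    variables:
      cluster_id: "1234-567890-abcde123"
  prod:
    variables:
      cluster_id:
        lookup:
          cluster: "Shared production cluster"
```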

## Tests
Added regression test
2024-06-19 08:03:06 +00:00
shreyas-goenka 553fdd1e81
Serialize dynamic value for `bundle validate` output (#1499)
## Changes
Using dynamic values allows us to retain references like
`${resources.jobs...}` even when the field's Go type cannot hold them
(e.g., the integer job ID in `run_job_task`), and in general for values
that do not map to the Go types for a field.
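
For example, this reference survives in the `bundle validate` output
even though the job ID is an integer in the Go types (snippet for
illustration):

```yaml
resources:
  jobs:
    orchestrator:
      tasks:
        - task_key: trigger
          run_job_task:
            job_id: ${resources.jobs.my_other_job.id}
```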

## Tests
Integration test
2024-06-18 15:04:20 +00:00
shreyas-goenka 274688d8a2
Clean up unused code (#1502)
## Changes
1. Removes `DefaultMutatorsForTarget` which is no longer used anywhere
2. Makes SnapshotPath a private field. It's no longer needed by data
structures outside its package.

FYI, I also tried finding other instances of dead code, but I could not
find anything else that was safe to remove. I used
https://go.dev/blog/deadcode to search for them; the other instances
either implemented an interface, increased test coverage for some of our
other code paths, or could not be removed for some other reason (e.g.,
autogenerated functions or functions only used in tests).

Good sign our codebase is mostly clean (at least superficially).
2024-06-18 14:14:27 +00:00
shreyas-goenka 44e3928d6a
Avoid multiple file tree traversals on bundle deploy (#1493)
## Changes
To run bundle deploy from DBR we use an abstraction over the workspace
import / export APIs to create a `filer.Filer` and abstract the file
system. Walking the file tree in such a filer is expensive and requires
multiple API calls. This PR removes the two duplicate file tree walks by
caching the result.
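
The caching shape, as an illustrative sketch (names are not the actual
CLI types):

```go
import "sync"

// cachedWalk computes the expensive file tree walk once and hands the
// same result to every subsequent caller.
type cachedWalk struct {
	once  sync.Once
	files []string
	err   error
}

func (c *cachedWalk) Files(walk func() ([]string, error)) ([]string, error) {
	c.once.Do(func() { c.files, c.err = walk() })
	return c.files, c.err
}
```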
2024-06-17 09:48:52 +00:00
Pieter Noordhuis 311dfa4642
Upgrade TF provider to 1.47.0 (#1476)
## Changes

This includes a bugfix for provisioning jobs with `num_workers = 0`.

Fixes #1472.

## Tests

Manually tested this fixes the issue.
2024-06-05 11:33:43 +00:00
Pieter Noordhuis 70fd8ad3d7
Update OpenAPI spec (#1466)
## Changes

Notable changes:

* Pagination of account-level storage credentials
* Rename app deployment method

Go SDK release notes:
https://github.com/databricks/databricks-sdk-go/releases/tag/v0.42.0

## Tests

* Nightlies pass.
2024-06-03 14:14:48 +00:00
Pieter Noordhuis c9b4f11947
Update error checks that use the `os` package to use `errors.Is` (#1461)
## Changes

From the [documentation](https://pkg.go.dev/os#IsNotExist) on the
functions in the `os` package:
> This function predates errors.Is. It only supports errors returned by
the os package.
> New code should use errors.Is(err, fs.ErrNotExist).

This issue surfaced while working on using a different `vfs.Path`
implementation that uses errors from the `fs` package. Calls to
`os.IsNotExist` didn't return true for errors that wrap
`fs.ErrNotExist`.
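
The migration in a nutshell (sketch):

```go
import (
	"errors"
	"io/fs"
	"os"
)

// Before: only matched errors created by the os package itself.
func notExistOld(err error) bool { return os.IsNotExist(err) }

// After: also matches any error that wraps fs.ErrNotExist, including
// errors returned by fs.FS implementations.
func notExistNew(err error) bool { return errors.Is(err, fs.ErrNotExist) }
```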

## Tests

n/a
2024-06-03 12:39:36 +00:00
Pieter Noordhuis 30fd84893f
Generate bundle schema placeholder for quality monitors (#1465)
## Changes

Generated with default generation command.

The team is making a fix to ensure the proper comments are included
later.

## Tests

n/a
2024-06-03 12:39:27 +00:00
Aravind Segu a33d0c8bf9
Add support for Lakehouse monitoring in bundles (#1307)
## Changes

This change adds support for Lakehouse monitoring in bundles.

The associated resource type name is "quality monitor".

## Testing

Unit tests.

---------

Co-authored-by: Pieter Noordhuis <pcnoordhuis@gmail.com>
Co-authored-by: Pieter Noordhuis <pieter.noordhuis@databricks.com>
Co-authored-by: Arpit Jasapara <87999496+arpitjasa-db@users.noreply.github.com>
2024-05-31 09:42:25 +00:00
Pieter Noordhuis 364a609ea7
Upgrade TF provider to 1.46.0 (#1460)
## Changes

Release notes in
https://github.com/databricks/terraform-provider-databricks/releases/tag/v1.46.0

Notable changes since 1.43.0:
* The job resource has been migrated to the Go SDK. More fields are now
passed through from DABs into TF.
* Improved zero-value handling.

## Tests

n/a
2024-05-31 07:13:43 +00:00
Pieter Noordhuis 424499ec1d
Abstract over filesystem interaction with libs/vfs (#1452)
## Changes

Introduce `libs/vfs` for an implementation of `fs.FS` and friends that
_includes_ the absolute path it is anchored to.

This is needed for:
1. Intercepting file operations to inject custom logic (e.g., logging,
access control).
2. Traversing directories to find specific leaf directories (e.g.,
`.git`).
3. Converting virtual paths to OS-native paths.

Options 2 and 3 are not possible with the standard `fs.FS` interface.
They are needed such that we can provide an instance to the sync package
and still detect the containing `.git` directory and convert paths to
native paths.
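
A minimal sketch of the idea (not the actual `libs/vfs` API): an `fs.FS`
that also carries the absolute path it is anchored to, which makes
conversion to OS-native paths possible:

```go
import (
	"io/fs"
	"os"
	"path/filepath"
)

// Path is an fs.FS anchored to an absolute OS-native root.
type Path interface {
	fs.FS
	Root() string // the absolute root this FS is anchored to
}

type osPath struct {
	fs.FS
	root string
}

func New(root string) Path {
	return &osPath{FS: os.DirFS(root), root: root}
}

func (p *osPath) Root() string { return p.root }

// Native converts a slash-separated virtual path to an OS-native path.
func Native(p Path, name string) string {
	return filepath.Join(p.Root(), filepath.FromSlash(name))
}
```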

This change focuses on making the following packages use `vfs.Path`:
* libs/fileset
* libs/git
* libs/sync

All entries returned by `fileset.All` are now slash-separated. This has
2 consequences:
* The sync snapshot now always uses slash-separated paths
* We don't need to call `filepath.FromSlash` as much as we did

## Tests

* All unit tests pass
* All integration tests pass
* Manually confirmed that a deployment made on Windows by a previous
version of the CLI can be deployed by a new version of the CLI while
retaining the validity of the local sync snapshot as well as the remote
deployment state.
2024-05-30 07:41:50 +00:00
Pieter Noordhuis 63ceede335
Update Go SDK to v0.41.0 (#1445)
## Changes

Release notes at
https://github.com/databricks/databricks-sdk-go/releases/tag/v0.41.0.

## Tests

n/a
2024-05-22 07:41:32 +00:00
Andrew Nester 3f8036f2df
Fixed seg fault when specifying environment key for tasks (#1443)
## Changes
Fixed seg fault when specifying environment key for tasks
2024-05-21 10:00:04 +00:00
Andrew Nester a014d50a6a
Fixed panic when loading incorrectly defined jobs (#1402)
## Changes
If only a key was defined for a job in the YAML config, `validate`
previously failed with a segfault.

This PR validates that jobs are correctly defined and returns an error
if not.
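
For illustration, a job defined by its key alone, which previously
crashed validation and now yields an error:

```yaml
resources:
  jobs:
    my_job:
```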

## Tests
Added regression test
2024-05-17 10:10:17 +00:00
Pieter Noordhuis dd94107853
Remove dependency on `ConfigFilePath` from path translation mutator (#1437)
## Changes

This is one step toward removing the `path.Paths` struct embedding from
resource types.

Going forward, we'll exclusively use the `dyn.Value` tree for location
information.

## Tests

Existing unit tests that cover path resolution with fallback behavior
pass.
2024-05-17 09:26:09 +00:00
Miles Yucht f7d4b272f4
Improve token refresh flow (#1434)
## Changes
Currently, there are a number of issues with the non-happy-path flows
for token refresh in the CLI.

If the token refresh fails, the raw error message is presented to the
user, as seen below. This message is very difficult for users to
interpret and doesn't give any clear direction on how to resolve this
issue.
```
Error: token refresh: Post "https://adb-<WSID>.azuredatabricks.net/oidc/v1/token": http 400: {"error":"invalid_request","error_description":"Refresh token is invalid"}
```

When logging in again, I've noticed that the timeout for logging in is
very short, only 45 seconds. If a user is using a password manager and
needs to log in to that first, or needs to do MFA, 45 seconds may not be
enough time. Additionally, when logging in to an account-level profile,
it is quite frustrating for users to need to re-enter account ID
information when that information is already stored in the user's
`.databrickscfg` file.

This PR tackles these two issues. First, the presentation of error
messages from `databricks auth token` is improved substantially by
converting the `error` into a human-readable message. When the refresh
token is invalid, it will present a command for the user to run to
reauthenticate. If the token fetching failed for some other reason, that
reason will be presented in a nice way, providing front-line debugging
steps and ultimately redirecting users to file a ticket at this repo if
they can't resolve the issue themselves. After this PR, the new error
message is:
```
Error: a new access token could not be retrieved because the refresh token is invalid. To reauthenticate, run `.databricks/databricks auth login --host https://adb-<WSID>.azuredatabricks.net`
```

To improve the login flow, this PR modifies `databricks auth login` to
auto-complete the account ID from the profile when present.
Additionally, it increases the login timeout from 45 seconds to 1 hour
to give the user sufficient time to login as needed.

To test this change, I needed to refactor some components of the CLI
around profile management, the token cache, and the API client used to
fetch OAuth tokens. These are now settable in the context, and a
demonstration of how they can be set and used is found in
`auth_test.go`.

Separately, this also demonstrates a sort-of integration test of the CLI
by executing the Cobra command for `databricks auth token` from tests,
which may be useful for testing other end-to-end functionality in the
CLI. In particular, I believe this is necessary in order to set flag
values (like the `--profile` flag in this case) for use in testing.

## Tests
Unit tests cover the unhappy and happy paths using the mocked API
client, token cache, and profiler.

Manually tested

---------

Co-authored-by: Pieter Noordhuis <pieter.noordhuis@databricks.com>
2024-05-16 10:22:09 +00:00
dependabot[bot] 216d2b058a
Bump github.com/databricks/databricks-sdk-go from 0.39.0 to 0.40.1 (#1431)
Bumps
[github.com/databricks/databricks-sdk-go](https://github.com/databricks/databricks-sdk-go)
from 0.39.0 to 0.40.1.
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/databricks/databricks-sdk-go/blob/main/CHANGELOG.md">github.com/databricks/databricks-sdk-go's
changelog</a>.</em></p>
<blockquote>
<h2>0.40.1</h2>
<ul>
<li>Fixed codecov for repository (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/pull/909">#909</a>).</li>
<li>Add traceparent header to enable distributed tracing. (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/pull/914">#914</a>).</li>
<li>Log cancelled and failed requests (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/pull/919">#919</a>).</li>
</ul>
<p>Dependency updates:</p>
<ul>
<li>Bump golang.org/x/net from 0.22.0 to 0.24.0 (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/pull/884">#884</a>).</li>
<li>Bump golang.org/x/net from 0.17.0 to 0.23.0 in /examples/zerolog (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/pull/896">#896</a>).</li>
<li>Bump golang.org/x/net from 0.21.0 to 0.23.0 in /examples/slog (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/pull/897">#897</a>).</li>
</ul>
<h2>0.40.0</h2>
<ul>
<li>Allow unlimited timeouts in retries (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/pull/904">#904</a>).
By setting RETRY_TIMEOUT_SECONDS to a negative value, WorkspaceClient
and AccountClient will retry retriable failures indefinitely. As a
reminder, without setting this parameter, the default retry timeout is 5
minutes.</li>
</ul>
<p>API Changes:</p>
<ul>
<li>Changed <code>Create</code> method for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#AppsAPI">w.Apps</a>
workspace-level service. New request type is <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#CreateAppRequest">serving.CreateAppRequest</a>.</li>
<li>Changed <code>Create</code> method for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#AppsAPI">w.Apps</a>
workspace-level service to return <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#App">serving.App</a>.</li>
<li>Removed <code>DeleteApp</code> method for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#AppsAPI">w.Apps</a>
workspace-level service.</li>
<li>Removed <code>GetApp</code> method for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#AppsAPI">w.Apps</a>
workspace-level service.</li>
<li>Removed <code>GetAppDeploymentStatus</code> method for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#AppsAPI">w.Apps</a>
workspace-level service.</li>
<li>Removed <code>GetApps</code> method for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#AppsAPI">w.Apps</a>
workspace-level service.</li>
<li>Removed <code>GetEvents</code> method for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#AppsAPI">w.Apps</a>
workspace-level service.</li>
<li>Added <code>CreateDeployment</code> method for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#AppsAPI">w.Apps</a>
workspace-level service.</li>
<li>Added <code>Delete</code> method for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#AppsAPI">w.Apps</a>
workspace-level service.</li>
<li>Added <code>Get</code> method for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#AppsAPI">w.Apps</a>
workspace-level service.</li>
<li>Added <code>GetDeployment</code> method for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#AppsAPI">w.Apps</a>
workspace-level service.</li>
<li>Added <code>GetEnvironment</code> method for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#AppsAPI">w.Apps</a>
workspace-level service.</li>
<li>Added <code>List</code> method for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#AppsAPI">w.Apps</a>
workspace-level service.</li>
<li>Added <code>ListDeployments</code> method for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#AppsAPI">w.Apps</a>
workspace-level service.</li>
<li>Added <code>Stop</code> method for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#AppsAPI">w.Apps</a>
workspace-level service.</li>
<li>Added <code>Update</code> method for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#AppsAPI">w.Apps</a>
workspace-level service.</li>
<li>Removed <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#AppEvents">serving.AppEvents</a>.</li>
<li>Removed <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#AppManifest">serving.AppManifest</a>.</li>
<li>Removed <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#AppServiceStatus">serving.AppServiceStatus</a>.</li>
<li>Removed <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#DeleteAppResponse">serving.DeleteAppResponse</a>.</li>
<li>Removed <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#DeployAppRequest">serving.DeployAppRequest</a>.</li>
<li>Removed <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#DeploymentStatus">serving.DeploymentStatus</a>.</li>
<li>Removed <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#DeploymentStatusState">serving.DeploymentStatusState</a>.</li>
<li>Removed <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#GetAppDeploymentStatusRequest">serving.GetAppDeploymentStatusRequest</a>.</li>
<li>Removed <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#GetAppResponse">serving.GetAppResponse</a>.</li>
<li>Removed <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#GetEventsRequest">serving.GetEventsRequest</a>.</li>
<li>Removed <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#ListAppEventsResponse">serving.ListAppEventsResponse</a>.</li>
<li>Changed <code>Apps</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#ListAppsResponse">serving.ListAppsResponse</a>
to <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#AppList">serving.AppList</a>.</li>
<li>Added <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#App">serving.App</a>.</li>
<li>Added <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#AppDeployment">serving.AppDeployment</a>.</li>
<li>Added <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#AppDeploymentState">serving.AppDeploymentState</a>.</li>
<li>Added <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#AppDeploymentStatus">serving.AppDeploymentStatus</a>.</li>
</ul>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="72334ef449"><code>72334ef</code></a>
Release v0.40.1 (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/issues/920">#920</a>)</li>
<li><a
href="63343c8c66"><code>63343c8</code></a>
Log cancelled and failed requests (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/issues/919">#919</a>)</li>
<li><a
href="27d08a67df"><code>27d08a6</code></a>
Add traceparent header to enable distributed tracing. (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/issues/914">#914</a>)</li>
<li><a
href="b5322db1a7"><code>b5322db</code></a>
Bump golang.org/x/net from 0.21.0 to 0.23.0 in /examples/slog (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/issues/897">#897</a>)</li>
<li><a
href="5edb2dfc92"><code>5edb2df</code></a>
Bump golang.org/x/net from 0.17.0 to 0.23.0 in /examples/zerolog (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/issues/896">#896</a>)</li>
<li><a
href="0e8dba0f70"><code>0e8dba0</code></a>
Bump golang.org/x/net from 0.22.0 to 0.24.0 (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/issues/884">#884</a>)</li>
<li><a
href="b6e76fc44b"><code>b6e76fc</code></a>
Fixed codecov for repository (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/issues/909">#909</a>)</li>
<li><a
href="70aa8eea19"><code>70aa8ee</code></a>
Release v0.40.0 (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/issues/908">#908</a>)</li>
<li><a
href="dae1642c5d"><code>dae1642</code></a>
Add ToLibraryList() for ClusterStatusResponse (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/issues/906">#906</a>)</li>
<li><a
href="c6516aa754"><code>c6516aa</code></a>
Bump version of mockery (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/issues/907">#907</a>)</li>
<li>Additional commits viewable in <a
href="https://github.com/databricks/databricks-sdk-go/compare/v0.39.0...v0.40.1">compare
view</a></li>
</ul>
</details>
<br />

<details>
<summary>Most Recent Ignore Conditions Applied to This Pull
Request</summary>

| Dependency Name | Ignore Conditions |
| --- | --- |
| github.com/databricks/databricks-sdk-go | [>= 0.28.a, < 0.29] |
</details>


[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=github.com/databricks/databricks-sdk-go&package-manager=go_modules&previous-version=0.39.0&new-version=0.40.1)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)

Dependabot will resolve any conflicts with this PR as long as you don't
alter it yourself. You can also trigger a rebase manually by commenting
`@dependabot rebase`.

[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)

---

<details>
<summary>Dependabot commands and options</summary>
<br />

You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits
that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after
your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge
and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating
it. You can achieve the same result by closing it manually
- `@dependabot show <dependency name> ignore conditions` will show all
of the ignore conditions of the specified dependency
- `@dependabot ignore this major version` will close this PR and stop
Dependabot creating any more for this major version (unless you reopen
the PR or upgrade to it yourself)
- `@dependabot ignore this minor version` will close this PR and stop
Dependabot creating any more for this minor version (unless you reopen
the PR or upgrade to it yourself)
- `@dependabot ignore this dependency` will close this PR and stop
Dependabot creating any more for this dependency (unless you reopen the
PR or upgrade to it yourself)


</details>

---------

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Pieter Noordhuis <pieter.noordhuis@databricks.com>
2024-05-16 09:04:58 +00:00
Ilia Babanov 2035516fde
Don't merge-in remote resources during deployments (#1432)
## Changes
`check_running_resources` now pulls the remote state without modifying
the bundle state, similar to how it did before. This avoids a
problem when we fail to compute deployment metadata for a deleted job
(which we shouldn't do in the first place).

`deploy_then_remove_resources_test` now also deploys and deletes a job
(in addition to a pipeline), which catches the error that this PR fixes.

## Tests
Unit and integ tests
2024-05-15 12:41:44 +00:00
Andrew Nester 0a21428a48
Upgrade to 1.43 terraform provider (#1429)
## Changes
Upgrade to 1.43 terraform provider
2024-05-14 12:19:34 +00:00
shreyas-goenka 63617253bd
Assert custom marshalling is implemented for resources (#1425)
## Changes
This PR ensures every resource implements a custom marshaller /
unmarshaller. This is required because we directly embed Go SDK structs,
which implement custom marshalling overrides. Since the struct is
embedded, the [custom marshalling
overrides](https://pkg.go.dev/encoding/json#example-package-CustomMarshalJSON)
are promoted to the top level. If the embedded struct itself is nil,
then JSON marshal / unmarshal will panic because it tries to call
`MarshalJSON` / `UnmarshalJSON` on a nil object.

Fixing this issue at the Go SDK level does not seem possible. Discussed
with @hectorcast-db.
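
As a minimal, self-contained sketch of the failure mode (the type names
below are illustrative stand-ins, not the actual SDK structs):

```go
package main

import (
	"encoding/json"
	"fmt"
)

// Inner stands in for an embedded Go SDK struct with a custom marshaller.
type Inner struct {
	Name string `json:"name"`
}

// MarshalJSON dereferences its receiver, so it panics on a nil *Inner.
func (i *Inner) MarshalJSON() ([]byte, error) {
	return json.Marshal(i.Name)
}

// Resource embeds *Inner, so Inner's MarshalJSON is promoted to the
// top level of Resource's method set.
type Resource struct {
	*Inner
	ID string `json:"id"`
}

func main() {
	defer func() {
		if r := recover(); r != nil {
			fmt.Println("recovered:", r) // nil pointer dereference
		}
	}()
	// Resource itself is non-nil, so encoding/json invokes the promoted
	// MarshalJSON with a nil *Inner receiver, which panics.
	_, _ = json.Marshal(Resource{ID: "abc"})
}
```

Defining a marshaller on each resource type that handles the nil
embedded struct avoids the panic.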
2024-05-14 10:30:48 +00:00
shreyas-goenka 95bbe2ece1
Fix flaky tests for the parallel mutator (#1426)
## Changes
Around 0.5% to 1% of the time, the tests would fail due to concurrent
access to the underlying slice in the mutator. This PR makes the test
thread-safe, preventing race conditions.

Example of failed run:
https://github.com/databricks/cli/actions/runs/9004657555/job/24738145829
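
The general shape of such a fix, sketched here with illustrative names
(the actual test code is not shown in this log), is to guard the shared
slice with a mutex so concurrent appends are safe under `go test -race`:

```go
package main

import (
	"fmt"
	"sync"
)

// collector guards a shared slice against concurrent appends.
type collector struct {
	mu    sync.Mutex
	items []string
}

func (c *collector) add(item string) {
	c.mu.Lock()
	defer c.mu.Unlock()
	c.items = append(c.items, item)
}

func main() {
	c := &collector{}
	var wg sync.WaitGroup
	for i := 0; i < 10; i++ {
		wg.Add(1)
		go func(i int) {
			defer wg.Done()
			c.add(fmt.Sprintf("item-%d", i))
		}(i)
	}
	wg.Wait()
	fmt.Println(len(c.items)) // always 10, with no data race
}
```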
2024-05-13 12:16:43 +00:00
Andrew Nester a393c87ed9
Upgrade TF provider to 1.42.0 (#1418)
## Changes
Upgrade TF provider to 1.42.0

Also fixes #1258
2024-05-06 11:41:37 +00:00
shreyas-goenka 30215860e7
Fix description memoization in bundle schema (#1409)
## Changes
Fixes https://github.com/databricks/cli/issues/559

The CLI generation is now stable and does not produce a diff for the
`bundle_descriptions.json` file.

Before, a pointer to the schema was stored in the memo and would be
mutated later to include the description. This led to duplicate
documentation for schema components that were used in multiple places.
This PR fixes this issue.

E.g., before, all references to `pause_status` would have the same
description.
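
A minimal sketch of the underlying bug (illustrative types, not the
generator's actual code): memoizing a pointer means a later mutation is
visible through every reference to the memoized entry.

```go
package main

import "fmt"

// Schema is an illustrative stand-in for a JSON schema node.
type Schema struct {
	Description string
}

var memo = map[string]*Schema{}

// get returns the memoized schema for key, creating it on first use.
// Because the memo stores a pointer, every caller shares one value.
func get(key string) *Schema {
	if s, ok := memo[key]; ok {
		return s
	}
	s := &Schema{}
	memo[key] = s
	return s
}

func main() {
	a := get("pause_status")
	a.Description = "description from the first call site"

	// A second call site receives the same pointer and therefore sees
	// the first call site's description: duplicated documentation.
	b := get("pause_status")
	fmt.Println(b.Description)
}
```

Memoizing a copy (or copying on return) keeps each use site independent.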

## Tests
Added regression test.
2024-05-01 11:04:37 +00:00
shreyas-goenka 507053ee50
Annotate DLT pipelines when deployed using DABs (#1410)
## Changes
This PR annotates any pipelines that were deployed using DABs to have
`deployment.kind` set to "BUNDLE", mirroring the annotation for Jobs
(similar PR for jobs FYI: https://github.com/databricks/cli/pull/880).

Breakglass UI is not yet available for pipelines, so this annotation
will just be used for revenue attribution ATM.

Note: The API field has been deployed in all regions including GovCloud.

## Tests
Unit tests and manually.

Manually verified that the kind and metadata_file_path are being set by
DABs, and are returned by a GET API to a pipeline deployed using a DAB.
Example:
```
    "deployment": {
      "kind":"BUNDLE",
      "metadata_file_path":"/Users/shreyas.goenka@databricks.com/.bundle/bundle-playground/default/state/metadata.json"
    },
```
2024-05-01 08:37:03 +00:00
Ilia Babanov 153141d3ea
Don't fail while parsing outdated terraform state (#1404)
`terraform show -json` (`terraform.Show()`) fails if the state file
contains resources with fields that no longer conform to the provider
schemas.

This can happen when you deploy a bundle with one version of the CLI,
then update the CLI to a version that uses a different Databricks
Terraform provider, and try to run `bundle run` or `bundle summary`.
Those commands don't recreate local terraform state (only `terraform
apply` or `plan` do) and terraform itself fails while parsing it.
[Terraform
docs](https://developer.hashicorp.com/terraform/language/state#format)
point out that it's best to use `terraform show` after successful
`apply` or `plan`.

Here we parse the state ourselves. The state file format is internal to
terraform, but it's more stable than our resource schemas. We only parse
a subset of fields from the state, and only update ID and ModifiedStatus
of bundle resources in the `terraform.Load` mutator.
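
A sketch of what parsing only a subset looks like (struct shapes assumed
from the documented state format, not the CLI's actual types); unknown
fields are ignored by `encoding/json`, which is what makes this robust
against provider schema drift:

```go
package main

import (
	"encoding/json"
	"fmt"
)

type stateInstance struct {
	Attributes struct {
		ID string `json:"id"`
	} `json:"attributes"`
}

type stateResource struct {
	Type      string          `json:"type"`
	Name      string          `json:"name"`
	Instances []stateInstance `json:"instances"`
}

type tfState struct {
	Resources []stateResource `json:"resources"`
}

func main() {
	// A field unknown to our structs ("some_new_field") is ignored
	// instead of failing the parse.
	raw := []byte(`{"resources":[{"type":"databricks_job","name":"my_job",
		"instances":[{"attributes":{"id":"123","some_new_field":true}}]}]}`)

	var state tfState
	if err := json.Unmarshal(raw, &state); err != nil {
		panic(err)
	}
	fmt.Println(state.Resources[0].Name, state.Resources[0].Instances[0].Attributes.ID)
}
```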
2024-05-01 08:22:35 +00:00
dependabot[bot] 781688c9cb
Bump github.com/databricks/databricks-sdk-go from 0.38.0 to 0.39.0 (#1405)
Bumps
[github.com/databricks/databricks-sdk-go](https://github.com/databricks/databricks-sdk-go)
from 0.38.0 to 0.39.0.
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/databricks/databricks-sdk-go/releases">github.com/databricks/databricks-sdk-go's
releases</a>.</em></p>
<blockquote>
<h2>v0.39.0</h2>
<h2>0.39.0</h2>
<ul>
<li>Ignored flaky integration tests (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/pull/894">#894</a>).</li>
<li>Added retries for &quot;worker env WorkerEnvId(workerenv-XXXXX) not
found&quot; (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/pull/890">#890</a>).</li>
<li>Updated SDK to OpenAPI spec (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/pull/899">#899</a>).</li>
</ul>
<p>Note: This release contains breaking changes, please see the API
changes below for more details.</p>
<p>API Changes:</p>
<ul>
<li>Added <code>IngestionDefinition</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/pipelines#CreatePipeline">pipelines.CreatePipeline</a>,
<a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/pipelines#EditPipeline">pipelines.EditPipeline</a>
and <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/pipelines#PipelineSpec">pipelines.PipelineSpec</a>.</li>
<li>Added <code>Deployment</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/pipelines#CreatePipeline">pipelines.CreatePipeline</a>,
<a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/pipelines#EditPipeline">pipelines.EditPipeline</a>
and <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/pipelines#PipelineSpec">pipelines.PipelineSpec</a>.</li>
<li>Added <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/compute#ClusterStatus">compute.ClusterStatus</a>.</li>
<li>Added <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/compute#ClusterStatusResponse">compute.ClusterStatusResponse</a>.</li>
<li>Added <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/compute#LibraryInstallStatus">compute.LibraryInstallStatus</a>.</li>
<li>Added <code>WarehouseId</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#NotebookTask">jobs.NotebookTask</a>.</li>
<li>Added <code>RunAs</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#SubmitRun">jobs.SubmitRun</a>.</li>
<li>Added <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/pipelines#DeploymentKind">pipelines.DeploymentKind</a>.</li>
<li>Added <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/pipelines#IngestionConfig">pipelines.IngestionConfig</a>.</li>
<li>Added <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/pipelines#ManagedIngestionPipelineDefinition">pipelines.ManagedIngestionPipelineDefinition</a>.</li>
<li>Added <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/pipelines#PipelineDeployment">pipelines.PipelineDeployment</a>.</li>
<li>Added <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/pipelines#SchemaSpec">pipelines.SchemaSpec</a>.</li>
<li>Added <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/pipelines#TableSpec">pipelines.TableSpec</a>.</li>
<li>Added <code>GetOpenApi</code> method for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#ServingEndpointsAPI">w.ServingEndpoints</a>
workspace-level service.</li>
<li>Added <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#GetOpenApiRequest">serving.GetOpenApiRequest</a>.</li>
<li>Added <code>SchemaId</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#SchemaInfo">catalog.SchemaInfo</a>.</li>
<li>Added <code>Operation</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#ValidationResult">catalog.ValidationResult</a>.</li>
<li>Added <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#ValidationResultOperation">catalog.ValidationResultOperation</a>.</li>
<li>Added <code>Requirements</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/compute#Library">compute.Library</a>.</li>
<li>Removed <code>AwsOperation</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#ValidationResult">catalog.ValidationResult</a>.</li>
<li>Removed <code>AzureOperation</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#ValidationResult">catalog.ValidationResult</a>.</li>
<li>Removed <code>GcpOperation</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#ValidationResult">catalog.ValidationResult</a>.</li>
<li>Removed <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#ValidationResultAwsOperation">catalog.ValidationResultAwsOperation</a>.</li>
<li>Removed <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#ValidationResultAzureOperation">catalog.ValidationResultAzureOperation</a>.</li>
<li>Removed <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#ValidationResultGcpOperation">catalog.ValidationResultGcpOperation</a>.</li>
<li>Removed <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/compute#ClusterStatusRequest">compute.ClusterStatusRequest</a>.</li>
<li>Removed <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/compute#LibraryFullStatusStatus">compute.LibraryFullStatusStatus</a>.</li>
<li>Changed <code>ClusterStatus</code> method for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/compute#LibrariesAPI">w.Libraries</a>
workspace-level service. New request type is <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/compute#ClusterStatus">compute.ClusterStatus</a>.</li>
<li>Changed <code>ClusterStatus</code> method for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/compute#LibrariesAPI">w.Libraries</a>
workspace-level service to return <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/compute#ClusterStatusResponse">compute.ClusterStatusResponse</a>.</li>
<li>Changed <code>Status</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/compute#LibraryFullStatus">compute.LibraryFullStatus</a>
to <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/compute#LibraryInstallStatus">compute.LibraryInstallStatus</a>.</li>
</ul>
<p>OpenAPI SHA: 21f9f1482f9d0d15228da59f2cd9f0863d2a6d55, Date:
2024-04-23</p>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="7672dece38"><code>7672dec</code></a>
Release v0.39.0 (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/issues/901">#901</a>)</li>
<li><a
href="2f56ab8431"><code>2f56ab8</code></a>
Update SDK to OpenAPI spec (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/issues/899">#899</a>)</li>
<li><a
href="fa3a5d24eb"><code>fa3a5d2</code></a>
Add retries for &quot;worker env WorkerEnvId(workerenv-XXXXX) not
found&quot; (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/issues/890">#890</a>)</li>
<li><a
href="219975c53f"><code>219975c</code></a>
Ignore flaky integration tests (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/issues/894">#894</a>)</li>
<li>See full diff in <a
href="https://github.com/databricks/databricks-sdk-go/compare/v0.38.0...v0.39.0">compare
view</a></li>
</ul>
</details>
<br />

<details>
<summary>Most Recent Ignore Conditions Applied to This Pull
Request</summary>

| Dependency Name | Ignore Conditions |
| --- | --- |
| github.com/databricks/databricks-sdk-go | [>= 0.28.a, < 0.29] |
</details>


[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=github.com/databricks/databricks-sdk-go&package-manager=go_modules&previous-version=0.38.0&new-version=0.39.0)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)

Dependabot will resolve any conflicts with this PR as long as you don't
alter it yourself. You can also trigger a rebase manually by commenting
`@dependabot rebase`.

[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)


---------

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Andrew Nester <andrew.nester@databricks.com>
2024-04-30 14:41:24 +00:00
shreyas-goenka d949f2b4f2
Fix bundle schema for variables (#1396)
## Changes
This PR fixes the variable schema to:
1. Allow non-string values in the "default" value of a variable.
2. Allow non-string overrides in a target for a variable. 

## Tests
Manually. There are no longer squiggly lines. 

Before:
<img width="329" alt="Screenshot 2024-04-24 at 3 26 43 PM"
src="https://github.com/databricks/cli/assets/88374338/43be02c2-80a4-4f80-bd79-0f3e1e93ee17">


After:
<img width="361" alt="Screenshot 2024-04-24 at 3 26 10 PM"
src="https://github.com/databricks/cli/assets/88374338/2c1fb892-a2a2-478b-8d2e-9bda6d844b54">
2024-04-25 11:23:50 +00:00
shreyas-goenka e652333103
Fix variable overrides in targets for non-string variables (#1397)
Before, variable overrides that were not strings in a target did not
work. This PR fixes that. Tested manually and via a unit test.
2024-04-25 11:21:10 +00:00
shreyas-goenka 6fd581d173
Allow variable references in non-string fields in the JSON schema (#1398)
## Tests
Verified manually.

Before:
<img width="373" alt="Screenshot 2024-04-24 at 7 18 44 PM"
src="https://github.com/databricks/cli/assets/88374338/b4aef51f-0c16-4589-9d47-cdec9ab91158">

After:
<img width="364" alt="Screenshot 2024-04-24 at 7 18 31 PM"
src="https://github.com/databricks/cli/assets/88374338/3d8e412e-77ee-4641-943d-f99eab26ba02">
<img width="356" alt="Screenshot 2024-04-24 at 7 16 54 PM"
src="https://github.com/databricks/cli/assets/88374338/2aed369a-3c6a-4754-9c76-0969423f319e">

Manually verified the schema diff is sane. Example:
```
<                               "type": "boolean",
<                               "description": "If inference tables are enabled or not. NOTE: If you have already disabled payload logging once, you cannot enable again."
---
>                               "description": "If inference tables are enabled or not. NOTE: If you have already disabled payload logging once, you cannot enable again.",
>                               "anyOf": [
>                                 {
>                                   "type": "boolean"
>                                 },
>                                 {
>                                   "type": "string",
>                                   "pattern": "\\$\\{([a-zA-Z]+([-_]?[a-zA-Z0-9]+)*(\\.[a-zA-Z]+([-_]?[a-zA-Z0-9]+)*)*)\\}"
>                                 }
>                               ]
```
2024-04-25 11:20:45 +00:00
Lennart Kats (databricks) 60122f6035
Show a better error message for using wheel tasks with older DBR versions (#1373)
## Changes

This is a minor improvement to the error about wheel tasks with older
DBR versions, since we get questions about it every now and then. It
also adds a pointer to the docs that were added since the original
message was committed.

---------

Co-authored-by: Pieter Noordhuis <pcnoordhuis@gmail.com>
2024-04-23 19:36:25 +00:00
shreyas-goenka 1d9bf4b2c4
Add legacy option for `run_as` (#1384)
## Changes
This PR partially reverts the changes in
https://github.com/databricks/cli/pull/1233 and puts the old code under
an "experimental.use_legacy_run_as" configuration. This gives customers
who ran into the breaking change made in the PR a way out.


## Tests
Both manually and via unit tests.

Manually verified that run_as works for pipelines now. If a user
wants to use the feature, they need to be both a metastore admin and a
workspace admin.

---------

Error when the deploying user is a workspace admin but not a metastore
admin:
```
Error: terraform apply: exit status 1

Error: cannot update permissions: User is not a metastore admin for Metastore 'deco-uc-prod-aws-us-east-1'.

  with databricks_permissions.pipeline_foo,
  on bundle.tf.json line 23, in resource.databricks_permissions.pipeline_foo:
  23:       }
```

--------

Output of bundle validate:
```
➜  bundle-playground git:(master) ✗ cli bundle validate
Warning: You are using the legacy mode of run_as. The support for this mode is experimental and might be removed in a future release of the CLI. In order to run the DLT pipelines in your DAB as the run_as user this mode changes the owners of the pipelines to the run_as identity, which requires the user deploying the bundle to be a workspace admin, and also a Metastore admin if the pipeline target is in UC.
  at experimental.use_legacy_run_as
  in databricks.yml:13:22

Name: bundle-playground
Target: default
Workspace:
  Host: https://dbc-a39a1eb1-ef95.cloud.databricks.com
  User: shreyas.goenka@databricks.com
  Path: /Users/shreyas.goenka@databricks.com/.bundle/bundle-playground/default

Found 1 warning
```
2024-04-22 11:51:41 +00:00
Pieter Noordhuis 3108883a8f
Processing and completion of positional args to bundle run (#1120)
## Changes

With this change, both job parameters and task parameters can be
specified as positional arguments to bundle run. How the positional
arguments are interpreted depends on the configuration of the job.

### Examples:

For a job that has job parameters configured a user can specify:

```
databricks bundle run my_job -- --param1=value1 --param2=value2
```

And the run is kicked off with job parameters set to:
```json
{
  "param1": "value1",
  "param2": "value2"
}
```

Similarly, for a job that doesn't use job parameters and only has
`notebook_task` tasks, a user can specify:

```
databricks bundle run my_notebook_job -- --param1=value1 --param2=value2
```

And the run is kicked off with task level `notebook_params` configured
as:
```json
{
  "param1": "value1",
  "param2": "value2"
}
```

For a job that doesn't use job parameters and only has either
`spark_python_task` or `python_wheel_task` tasks, a user can specify:

```
databricks bundle run my_python_file_job -- --flag=value other arguments
```

And the run is kicked off with task level `python_params` configured as:
```json
[
  "--flag=value",
  "other",
  "arguments"
]
```

The same is applied to jobs with only `spark_jar_task` or
`spark_submit_task` tasks.

## Tests

Unit tests. Tested the completions manually.
2024-04-22 11:50:13 +00:00
Andrew Nester 1872aa12b3
Added support for job environments (#1379)
## Changes
The main changes are:
1. Artifacts are no longer linked to libraries; instead, we iterate over
all jobs and tasks when uploading artifacts and update local paths to
remote ones.
2. We iterate over `jobs.environments` to check whether any local
libraries are referenced and verify that they exist locally.
3. Added tests to check that environments are handled correctly.

End-to-end test will follow up

## Tests
Added regression test, existing tests (including integration one) pass
2024-04-22 11:44:34 +00:00
Lennart Kats (databricks) 000a7fef8c
Enable job queueing by default (#1385)
## Changes

This enables queueing for jobs by default, following the behavior from
API 2.2+. Queueing is a best practice and will be the default in API 2.2.
Since we're still using API 2.1, which has queueing disabled by default,
this PR enables queueing using a mutator.

Customers can manually turn off queueing for any job by adding the
following to their job spec:

```
queue:
  enabled: false
```

## Tests

Unit tests, manual confirmation of property after deployment.

---------

Co-authored-by: Pieter Noordhuis <pcnoordhuis@gmail.com>
2024-04-22 10:36:39 +00:00
Pieter Noordhuis cd675ded9a
Update `testutil` helpers to return path (#1383)
## Changes

I spotted a few call sites where the path of a test file was synthesized
multiple times. It is easier to capture the path as a variable and reuse
it.
2024-04-19 15:05:36 +00:00
shreyas-goenka 6ca57a7e68
Add docs URL for `run_as` in error message (#1381) 2024-04-19 14:09:33 +00:00
shreyas-goenka e008c2bd8c
Cleanup remote file path on bundle destroy (#1374)
## Changes
The sync struct initialization would recreate the deleted `file_path`.
This PR stops initializing the sync object when deleting the
snapshot, thus fixing the lingering `file_path` after `bundle destroy`.

## Tests
Manually, and an integration test to prevent regression.
2024-04-19 11:48:04 +00:00
shreyas-goenka 3c14204e98
Followup improvements to the Docker setup script (#1369)
## Changes
This PR:
1. Uses bash to run the setup.sh script instead of the native busybox sh
shipped with alpine.
2. Verifies the checksums of the installed terraform CLI binaries.
 
## Tests
Manually. The docker image successfully builds.

---------

Co-authored-by: Pieter Noordhuis <pieter.noordhuis@databricks.com>
2024-04-18 20:52:11 +00:00
Andrew Nester 6b81b627fe
Upgrade terraform-provider-databricks to 1.40.0 (#1376)
## Changes
Upgrade terraform-provider-databricks to 1.40.0
2024-04-18 20:20:01 +00:00
Andrew Nester 27f51c760f
Added validate mutator to surface additional bundle warnings (#1352)
## Changes
All these validators will return warnings as part of the `bundle validate`
run.

Added 2 mutators:
1. To check that, if tasks use `job_cluster_key`, it is actually defined
2. To check if there are any files to sync as part of deployment

Also added `bundle.Parallel` to run them in parallel.

To make sure mutators under `bundle.Parallel` do not mutate config, we
introduced the new `ReadOnlyMutator`, `ReadOnlyBundle` and `ReadOnlyConfig`
types.
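
A rough sketch of the parallel read-only pattern (the names follow this
PR, but the signatures are assumed, not copied from the CLI):

```go
package main

import (
	"context"
	"fmt"
	"sync"
)

// ReadOnlyBundle is a stand-in for a read-only view over the config.
type ReadOnlyBundle struct{}

// ReadOnlyMutator can inspect the bundle but not mutate it; it returns
// its warnings as diagnostics (a plain []string here for brevity).
type ReadOnlyMutator interface {
	Apply(ctx context.Context, rb ReadOnlyBundle) []string
}

// Parallel runs all mutators concurrently and merges their warnings.
func Parallel(ctx context.Context, rb ReadOnlyBundle, ms ...ReadOnlyMutator) []string {
	var (
		mu  sync.Mutex
		wg  sync.WaitGroup
		all []string
	)
	for _, m := range ms {
		wg.Add(1)
		go func(m ReadOnlyMutator) {
			defer wg.Done()
			diags := m.Apply(ctx, rb)
			mu.Lock()
			all = append(all, diags...)
			mu.Unlock()
		}(m)
	}
	wg.Wait()
	return all
}

type filesToSyncCheck struct{}

func (filesToSyncCheck) Apply(ctx context.Context, rb ReadOnlyBundle) []string {
	return []string{"There are no files to sync"}
}

func main() {
	fmt.Println(Parallel(context.Background(), ReadOnlyBundle{}, filesToSyncCheck{}))
}
```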

Example 

```
databricks bundle validate -p deco-staging
Warning: unknown field: new_cluster
  at resources.jobs.my_job
  in bundle.yml:24:7

Warning: job_cluster_key high_cpu_workload_job_cluster is not defined
  at resources.jobs.my_job.tasks[0].job_cluster_key
  in bundle.yml:35:28

Warning: There are no files to sync, please check your your .gitignore and sync.exclude configuration
  at sync.exclude
  in bundle.yml:18:5

Name: test
Target: default
Workspace:
  Host: https://acme.databricks.com
  User: andrew.nester@databricks.com
  Path: /Users/andrew.nester@databricks.com/.bundle/test/default

Found 3 warnings
```

## Tests
Added unit tests
2024-04-18 15:13:16 +00:00
Andrew Nester 542156c30b
Resolve variable references inside variable lookup fields (#1368)
## Changes
Allows for the syntax below

```
variables:
  service_principal_app_id:
    description: 'The app id of the service principal for running workflows as.'
    lookup:
      service_principal: "sp-${bundle.environment}"
```

Fixes #1259

## Tests
Added regression test
2024-04-18 09:56:16 +00:00
Lennart Kats (databricks) c3a7d17d1d
Disable locking for development mode (#1302)
## Changes

This changes `databricks bundle deploy` so that it skips the lock
acquisition/release step for a `mode: development` target:
* This saves about 2 seconds (measured over 100 runs on a quiet/busy
workspace).
* This helps avoid the `deploy lock acquired by lennart@company.com at
2024-02-28 15:48:38.40603 +0100 CET. Use --force-lock to override` error
* Risk: this may cause deployment conflicts, but since dev mode
deployments are always scoped to a user, that risk should be minimal

Update after discussion:
* This behavior can now be disabled via a setting.
* Docs PR: https://github.com/databricks/docs/pull/15873

## Measurements

### 100 deployments of the "python_default" project to an empty
workspace

_Before this branch:_
p50 time: 11.479 seconds
p90 time: 11.757 seconds

_After this branch:_
p50 time: 9.386 seconds
p90 time: 9.599 seconds

### 100 deployments of the "python_default" project to a busy (staging)
workspace

_Before this branch:_
* p50 time: 13.335 seconds
* p90 time: 15.295 seconds

_After this branch:_
* p50 time: 11.397 seconds
* p90 time: 11.743 seconds

### Typical duration of deployment steps

* Acquiring Deployment Lock: 1.096 seconds
* Deployment Preparations and Operations: 1.477 seconds
* Uploading Artifacts: 1.26 seconds
* Finalizing Deployment: 9.699 seconds
* Releasing Deployment Lock: 1.198 seconds

---------

Co-authored-by: Pieter Noordhuis <pcnoordhuis@gmail.com>
Co-authored-by: Andrew Nester <andrew.nester.dev@gmail.com>
2024-04-18 01:59:39 +00:00
dependabot[bot] c949655f9f
Bump github.com/databricks/databricks-sdk-go from 0.37.0 to 0.38.0 (#1361)
[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=github.com/databricks/databricks-sdk-go&package-manager=go_modules&previous-version=0.37.0&new-version=0.38.0)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)

Dependabot will resolve any conflicts with this PR as long as you don't
alter it yourself. You can also trigger a rebase manually by commenting
`@dependabot rebase`.

[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)


---------

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Andrew Nester <andrew.nester@databricks.com>
2024-04-16 12:03:21 +00:00
Andrew Nester ed56bbca16
Transform artifact file source patterns in the build stage, not the upload stage (#1359)
## Changes
Transform artifact file source patterns in the build stage rather than the upload stage.

Resolves the following warning
```
artifact section is not defined for file at /Users/andrew.nester/dabs/wheel/target/myjar.jar. Skipping uploading. In order to use the define 'artifacts' section
```

## Tests
Unit test pass
2024-04-12 16:00:42 +00:00
shreyas-goenka 5140a9a902
Add docker images for the CLI (#1353)
## Changes
This PR makes changes to support creating a docker image for the CLI
with the `terraform` dependencies built in. This is useful for customers
that operate in a network-restricted environment. Normally DABs makes
API calls to registry.terraform.io to set up the terraform dependencies;
with this setup, the CLI/DABs will rely on the provider binaries bundled
in the docker image.

### Specifically this PR makes the following changes:
----------------
Modifies the CLI release workflow to publish the docker images in the
Github Container Registry. URL:
https://github.com/databricks/cli/pkgs/container/cli.

We use docker support in `goreleaser` to build and publish the images.
Using goreleaser ensures the CLI packaged in the docker image is the
same release artifact as the normal releases. For more information see:
1. https://goreleaser.com/cookbooks/multi-platform-docker-images
2. https://goreleaser.com/customization/docker/

Other choices made include:
1. Using `alpine` as the base image. The reason is `alpine` is a small
and lightweight linux distribution (~5MB) and an industry standard.
2. Not using [docker
manifest](https://docs.docker.com/reference/cli/docker/manifest) to
create a multi-arch build. This is because the functionality is still
experimental.

------------------

Make the `DATABRICKS_TF_VERSION` and `DATABRICKS_TF_PROVIDER_VERSION`
environment variables optional for using the terraform file mirror.
While it's not strictly necessary to make the docker image work, it's
the "right" behaviour and reduces complexity. The rationale is:
- These environment variables are needed so the Databricks CLI does
not accidentally use the file mirror bundled with VSCode if it's
incompatible. This does not require the env vars to be mandatory.
context: https://github.com/databricks/cli/pull/1294
- This makes the `Dockerfile` and `setup.sh` simpler. We don't need an
[entrypoint.sh script to set the version environment
variables](https://medium.com/@leonardo5621_66451/learn-how-to-use-entrypoint-scripts-in-docker-images-fede010f172d).
This also makes using an interactive terminal with `docker run -it ...`
work out of the box.

 
## Tests


Tested manually. 

--------------------

To test the release pipeline I triggered a couple of dummy releases and
verified that the images are built successfully and uploaded to Github.
1. https://github.com/databricks/cli/pkgs/container/cli
2. workflow for release:
https://github.com/databricks/cli/actions/runs/8646106333

--------------------

I tested the docker container itself by setting up
[Charles](https://www.charlesproxy.com/) as an HTTP proxy and verifying
that no HTTP requests are made to `registry.terraform.io`

Before:
FYI, The Charles web proxy is hosted at localhost:8888.
```
shreyas.goenka@THW32HFW6T bundle-playground % rm -r .databricks 
shreyas.goenka@THW32HFW6T bundle-playground %  HTTP_PROXY="http://localhost:8888" HTTPS_PROXY="http://localhost:8888" cli bundle deploy
Uploading bundle files to /Users/shreyas.goenka@databricks.com/.bundle/bundle-playground/default/files...
Deploying resources...
Updating deployment state...
Deployment complete!
```
<img width="1275" alt="Screenshot 2024-04-11 at 3 21 45 PM"
src="https://github.com/databricks/cli/assets/88374338/15f37324-afbd-47c0-a40e-330ab232656b">

After:
This time bundle deploy is run from inside the docker container. We use
`host.docker.internal` to map to localhost on the host machine, and -v
to mount the host file system as a volume.
```
shreyas.goenka@THW32HFW6T bundle-playground % docker run -v ~/projects/bundle-playground:/bundle -v ~/.databrickscfg:/root/.databrickscfg  -it --entrypoint /bin/sh -e HTTP_PROXY="http://host.docker.internal:8888" -e HTTPS_PROXY="http://host.docker.internal:8888" --network host ghcr.io/databricks/cli:latest-arm64           
/ # cd /bundle/
/bundle # rm -r .databricks/
/bundle # databricks bundle deploy
Uploading bundle files to /Users/shreyas.goenka@databricks.com/.bundle/bundle-playground/default/files...
Deploying resources...
Updating deployment state...
Deployment complete!
```

<img width="1275" alt="Screenshot 2024-04-11 at 3 22 54 PM"
src="https://github.com/databricks/cli/assets/88374338/2a8f097e-734b-4b3e-8075-c02e98a1b275">
2024-04-12 15:22:30 +00:00
Gleb Kanterov e42156411b
Fix compute override for foreach tasks (#1357)
## Changes
Fix compute override for foreach tasks.

```
$ databricks bundle deploy --compute-id=xxx
```

## Tests
I added unit tests
2024-04-12 09:53:29 +00:00
Andrew Nester d914a1b1e2
Do not emit warning on YAML anchor blocks (#1354)
## Changes
In 0.217.0 we started to emit warnings on unknown fields in the YAML
configuration but wrongly considered YAML anchor blocks as unknown
fields.

This PR fixes this by skipping normalisation of YAML anchor blocks.

## Tests
Added regression tests
2024-04-10 09:55:02 +00:00
Andrew Nester 50d3bb4d56
Execute preinit after entry point to make sure scripts are loaded (#1351)
## Changes
Execute preinit after entry point to make sure scripts are loaded
2024-04-08 14:32:21 +00:00
Andrew Nester 2f4c0c1b56
Fixed pre-init script order (#1348)
## Changes
The `preinit` script needs to be executed before processing configuration
files to allow the script to modify the configuration or add its own
configuration files.
2024-04-08 13:28:38 +00:00
Andrew Nester 77ff994d1b
Correctly transform libraries in for_each_task block (#1340)
## Changes
Now DABs correctly transforms and deploys libraries in the `for_each_task`
block.

```
tasks:
  - task_key: my_loop
    for_each_task:
      inputs: "[1,2,3]"
      task:
        task_key: my_loop_iteration
        libraries:
          - pypi:
              package: my_package
```

## Tests
Added regression test
2024-04-05 15:52:39 +00:00
shreyas-goenka 7d1bab7cf0
Bump internal terraform provider version to `1.39` (#1339) 2024-04-05 14:49:04 +00:00
Pieter Noordhuis a95b1c7dcf
Retain location information of variable reference (#1333)
## Changes

Variable substitution works as if the variable reference is literally
replaced with its contents.

The following fields should be interpreted in the same way regardless of
where the variable is defined:
```yaml
foo: ${var.some_path}
bar: "./${var.some_path}"
```

Before this change, `foo` would inherit the location information of the
variable definition. After this change, it uses the location information
of the variable reference, making the behavior for `foo` and `bar`
identical.

Fixes #1330.

## Tests

The new test passes only with the fix.
2024-04-03 10:40:29 +00:00
dependabot[bot] f28a9d7107
Bump github.com/databricks/databricks-sdk-go from 0.36.0 to 0.37.0 (#1326)
[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=github.com/databricks/databricks-sdk-go&package-manager=go_modules&previous-version=0.36.0&new-version=0.37.0)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)

Dependabot will resolve any conflicts with this PR as long as you don't
alter it yourself. You can also trigger a rebase manually by commenting
`@dependabot rebase`.

[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)


---------

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Andrew Nester <andrew.nester@databricks.com>
2024-04-03 10:39:53 +00:00
Andrew Nester 8c144a2de4
Added `auth describe` command (#1244)
## Changes
This command provides details on the auth configuration in use, as well
as the authenticated user and the auth mechanism used.

Relies on https://github.com/databricks/databricks-sdk-go/pull/838
(tests will fail until merged)

Examples of output

```
Workspace: https://test.com
User: andrew.nester@databricks.com
Authenticated with: pat
-----
Configuration:
  ✓ auth_type: pat
  ✓ host: https://test.com (from bundle)
  ✓ profile: DEFAULT (from --profile flag)
  ✓ token: ******** (from /Users/andrew.nester/.databrickscfg config file)
```

```
DATABRICKS_AUTH_TYPE=azure-msi databricks auth describe -p "Azure 2"
Unable to authenticate: inner token: Post "https://foobar.com/oauth2/token": AADSTS900023: Specified tenant identifier foobar_aaaaaaa' is neither a valid DNS name, nor a valid external domain. See https://login.microsoftonline.com/error?code=900023
-----
Configuration:
  ✓ auth_type: azure-msi (from DATABRICKS_AUTH_TYPE environment variable)
  ✓ azure_client_id: 8470f3ba-aaaa-bbbb-cccc-xxxxyyyyzzzz (from /Users/andrew.nester/.databrickscfg config file)
  ~ azure_client_secret: ******** (from /Users/andrew.nester/.databrickscfg config file, not used for auth type azure-msi)
  ~ azure_tenant_id: foobar_aaaaaaa (from /Users/andrew.nester/.databrickscfg config file, not used for auth type azure-msi)
  ✓ azure_use_msi: true (from /Users/andrew.nester/.databrickscfg config file)
  ✓ host: https://foobar.com (from /Users/andrew.nester/.databrickscfg config file)
  ✓ profile: Azure 2 (from --profile flag)
```

For account

```
Unable to authenticate: default auth: databricks-cli: cannot get access token: Error: token refresh: Post "https://xxxxxxx.com/v1/token": http 400: {"error":"invalid_request","error_description":"Refresh token is invalid"}
. Config: host=https://xxxxxxx.com, account_id=ed0ca3c5-fae5-4619-bb38-eebe04a4af4b, profile=ACCOUNT-ed0ca3c5-fae5-4619-bb38-eebe04a4af4b
-----
Configuration:
  ✓ account_id: ed0ca3c5-fae5-4619-bb38-eebe04a4af4b (from /Users/andrew.nester/.databrickscfg config file)
  ✓ auth_type: databricks-cli (from /Users/andrew.nester/.databrickscfg config file)
  ✓ host: https://xxxxxxxxx.com (from /Users/andrew.nester/.databrickscfg config file)
  ✓ profile: ACCOUNT-ed0ca3c5-fae5-4619-bb38-eebe04a4af4b
```

## Tests
Added unit tests

---------

Co-authored-by: Julia Crawford (Databricks) <julia.crawford@databricks.com>
2024-04-03 08:14:04 +00:00
Ilia Babanov 079c416f8d
Add `bundle debug terraform` command (#1294)
- Add `bundle debug terraform` command. It prints the versions of
Terraform and the Databricks Terraform provider. In text mode it
also explains how to set up the CLI in environments with restricted
internet access.
- Use `DATABRICKS_TF_EXEC_PATH` env var to point Databricks CLI to the
Terraform binary. The CLI only uses it if `DATABRICKS_TF_VERSION`
matches the currently used terraform version.
- Use `DATABRICKS_TF_CLI_CONFIG_FILE` env var to point Terraform CLI
config that points to the filesystem mirror for the Databricks provider.
The CLI only uses it if `DATABRICKS_TF_PROVIDER_VERSION` matches the
currently used provider version.


Relevant PR on the VSCode extension side:
https://github.com/databricks/databricks-vscode/pull/1147

Example output of the `databricks bundle debug terraform`:
```
Terraform version: 1.5.5
Terraform URL: https://releases.hashicorp.com/terraform/1.5.5

Databricks Terraform Provider version: 1.38.0
Databricks Terraform Provider URL: https://github.com/databricks/terraform-provider-databricks/releases/tag/v1.38.0

Databricks CLI downloads its Terraform dependencies automatically.

If you run the CLI in an air-gapped environment, you can download the dependencies manually and set these environment variables:

  DATABRICKS_TF_VERSION=1.5.5
  DATABRICKS_TF_EXEC_PATH=/path/to/terraform/binary
  DATABRICKS_TF_PROVIDER_VERSION=1.38.0
  DATABRICKS_TF_CLI_CONFIG_FILE=/path/to/terraform/cli/config.tfrc

Here is an example *.tfrc configuration file:

  disable_checkpoint = true
  provider_installation {
    filesystem_mirror {
      path = "/path/to/a/folder/with/databricks/terraform/provider"
    }
  }

The filesystem mirror path should point to the folder with the Databricks Terraform Provider. The folder should have this structure: /registry.terraform.io/databricks/databricks/terraform-provider-databricks_1.38.0_ARCH.zip

For more information about filesystem mirrors, see the Terraform documentation: https://developer.hashicorp.com/terraform/cli/config/config-file#filesystem_mirror
```

---------

Co-authored-by: shreyas-goenka <88374338+shreyas-goenka@users.noreply.github.com>
2024-04-02 12:56:27 +00:00
Andrew Nester 56e393c743
Allow specifying CLI version constraints required to run the bundle (#1320)
## Changes
Allow specifying CLI version constraints required to run the bundle

Example of configuration:

#### only allow specific version
```
bundle:
  name: my-bundle
  databricks_cli_version: "0.210.0"
```

#### allow all patch releases
```
bundle:
  name: my-bundle
  databricks_cli_version: "0.210.*"
```

#### constrain minimum version
```
bundle:
  name: my-bundle
  databricks_cli_version: ">= 0.210.0"
```

#### constrain range
```
bundle:
  name: my-bundle
  databricks_cli_version: ">= 0.210.0, <= 1.0.0"
```

For other examples see:
https://github.com/Masterminds/semver?tab=readme-ov-file#checking-version-constraints

Example error
```
sh-3.2$ databricks bundle validate
Error: Databricks CLI version constraint not satisfied. Required: >= 1.0.0, current: 0.216.0
```
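
A sketch of the kind of check that produces the error above, using the
Masterminds/semver package linked earlier in this description (how the
CLI wires this in is assumed, not shown in this log):

```go
package main

import (
	"fmt"

	"github.com/Masterminds/semver/v3"
)

// checkVersion returns an error when the current CLI version does not
// satisfy the configured databricks_cli_version constraint.
func checkVersion(constraint, current string) error {
	c, err := semver.NewConstraint(constraint)
	if err != nil {
		return fmt.Errorf("invalid version constraint %q: %w", constraint, err)
	}
	v, err := semver.NewVersion(current)
	if err != nil {
		return err
	}
	if !c.Check(v) {
		return fmt.Errorf("Databricks CLI version constraint not satisfied. Required: %s, current: %s", constraint, current)
	}
	return nil
}

func main() {
	fmt.Println(checkVersion(">= 1.0.0", "0.216.0"))
}
```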
## Tests
Added unit tests covering all possible configuration permutations

---------

Co-authored-by: Lennart Kats (databricks) <lennart.kats@databricks.com>
2024-04-02 12:55:21 +00:00
shreyas-goenka cddc5f97f8
Fix the generated DABs JSON schema (#1322)
## Changes
This PR fixes bundle schema being broken because `for_each_task: null`
was set in the generated schema. This is not valid according to the JSON
schema specification and thus the Red Hat YAML VSCode extension was
failing to parse the YAML configuration.

This PR fixes: https://github.com/databricks/cli/issues/1312

## Tests
The fix itself was tested manually. I asserted that the autocompletion
works now. This was mistakenly overlooked the first time around when the
regression was introduced in https://github.com/databricks/cli/pull/1204
because the YAML extension provides best-effort autocomplete suggestions
even if the JSON schema fails to load.

To prevent future regressions we also add a test to assert that the JSON
schema generated itself is a valid JSON schema object. This is done via
using the `ajv-cli` to validate the schema. This package is also used by
the Red Hat YAML extension and thus provides a high fidelity check for
ensuring the JSON schema is valid.

Before, with the old schema:
```
shreyas.goenka@THW32HFW6T cli-versions % ajv validate -s proj/schema-216.json -d ../bundle-playground-3/databricks.yml
schema proj/schema-216.json is invalid
error: schema is invalid: data/properties/resources/properties/jobs/additionalProperties/properties/tasks/items/properties/for_each_task must be object,boolean, data/properties/resources/properties/jobs/additionalProperties/properties/tasks/items must be array, data/properties/resources/properties/jobs/additionalProperties/properties/tasks/items must match a schema in anyOf
```

After, with the new schema:
```
shreyas.goenka@THW32HFW6T cli-versions % ajv validate -s proj/schema-dev.json -d ../bundle-playground-3/databricks.yml
../bundle-playground-3/databricks.yml valid
```

After, autocomplete suggestions:

<img width="600" alt="Screenshot 2024-03-27 at 6 35 57 PM"
src="https://github.com/databricks/cli/assets/88374338/d0a62402-e323-4f36-854d-332b33cbeab8">
2024-03-28 11:25:36 +00:00
Pieter Noordhuis eea34b2504
Return diagnostics from `config.Load` (#1324)
## Changes

We no longer need to store load diagnostics on the `config.Root` type
itself and instead can return them from the `config.Load` call directly.
It is up to the caller of this function to append them to previous
diagnostics, if any.

Background: previous commits moved configuration loading of the entry
point into a mutator, so now all diagnostics naturally flow from
applying mutators.

This PR depends on #1319.

## Tests

Unit and manual validation of the debug statements in the validate
command.
2024-03-28 10:59:03 +00:00
shreyas-goenka 5df4c7e134
Add allow list for resources when bundle `run_as` is set (#1233)
## Changes
This PR introduces an allow list of resource types that may be deployed
when the run_as for the bundle is not the same as the current deployment
user.

This PR also adds a test to ensure that any new resource type added to
DABs must either be added to the allow list or produce an error when the
run_as identity is not the same as the deployment user.
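
A minimal sketch of what such an allow-list check can look like (the
entries and names are illustrative; the actual list lives in the CLI
source):

```go
package config

import "fmt"

// allowedWithRunAs lists resource types that may be deployed when the
// bundle's run_as differs from the deploying user (illustrative entry).
var allowedWithRunAs = map[string]bool{
	"jobs": true,
}

// validateRunAsResources fails for any resource type not on the allow
// list when run_as differs from the deployment user.
func validateRunAsResources(resourceTypes []string, runAsIsDeployer bool) error {
	if runAsIsDeployer {
		return nil
	}
	for _, t := range resourceTypes {
		if !allowedWithRunAs[t] {
			return fmt.Errorf("resource type %q is not supported when run_as differs from the deployment user", t)
		}
	}
	return nil
}
```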

## Tests
Unit tests
2024-03-27 16:13:53 +00:00
shreyas-goenka 704d069459
Make `bundle.deployment` optional in the bundle schema (#1321)
## Changes
Makes the field optional by adding the `omitempty` tag.

This gets rid of the red squiggly lines in the bundle schema.
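For context, `omitempty` is a standard Go `encoding/json` struct tag option: a field holding its zero value is dropped during marshaling, which is what makes the field optional downstream. A minimal sketch with illustrative type names, not the actual bundle structs:
```
package main

import (
	"encoding/json"
	"fmt"
)

type Deployment struct {
	FailOnActiveRuns bool `json:"fail_on_active_runs,omitempty"`
}

type Bundle struct {
	Name string `json:"name"`
	// The pointer plus omitempty means a nil deployment is omitted
	// entirely instead of serializing as "deployment": null.
	Deployment *Deployment `json:"deployment,omitempty"`
}

func main() {
	out, _ := json.Marshal(Bundle{Name: "my_bundle"})
	fmt.Println(string(out)) // {"name":"my_bundle"}
}
```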
2024-03-27 13:37:59 +00:00
Pieter Noordhuis ca534d596b
Load bundle configuration from mutator (#1318)
## Changes

Prior to this change, the bundle configuration entry point was loaded
from the function `bundle.Load`. Other configuration files were only
loaded once the caller applied the first set of mutators. This
separation was unnecessary and not ideal in light of gathering
diagnostics while loading _any_ configuration file, not just the ones
from the includes.

This change:
* Updates `bundle.Load` to only verify that the specified path is a
valid bundle root.
* Moves mutators that perform loading to `bundle/config/loader`.
* Adds a "load" phase that takes the place of applying
`DefaultMutators`.

Follow ups:
* Rename `bundle.Load` -> `bundle.Find` (because it no longer performs
loading)

This change depends on #1316 and #1317.

## Tests

Tests pass.
2024-03-27 10:49:05 +00:00
Pieter Noordhuis f195b84475
Remove support for DATABRICKS_BUNDLE_INCLUDES (#1317)
## Changes

PR #604 added functionality to load a bundle without a `databricks.yml`
if both the `DATABRICKS_BUNDLE_ROOT` and `DATABRICKS_BUNDLE_INCLUDES`
environment variables were set. We never ended up using this in
downstream tools so this can be removed.

## Tests

Unit tests pass.
2024-03-27 10:13:54 +00:00
Pieter Noordhuis 00d76d5afa
Move path field to bundle type (#1316)
## Changes

The bundle path was previously stored on the `config.Root` type under
the assumption that the first configuration file being loaded would set
it. This is slightly counterintuitive and we know what the path is upon
construction of the bundle. The new location for this property reflects
this.

## Tests

Unit tests pass.
2024-03-27 09:03:24 +00:00
Pieter Noordhuis ed194668db
Return `diag.Diagnostics` from mutators (#1305)
## Changes

This diagnostics type allows us to capture multiple warnings as well as
errors in the return value. This is a preparation for returning
additional warnings from mutators in case we detect non-fatal problems.

* All return statements that previously returned an error now return
`diag.FromErr`
* All return statements that previously returned `fmt.Errorf` now return
`diag.Errorf`
* All `err != nil` checks now use `diags.HasError()` or `diags.Error()`
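A minimal sketch of the new shape of a mutator after this change, using only the helpers named above; the mutator name and `checkSomething` helper are illustrative, not actual CLI code (assumed imports: `context`, plus the CLI's `bundle` and `libs/diag` packages):

```
func (m *exampleMutator) Apply(ctx context.Context, b *bundle.Bundle) diag.Diagnostics {
	// Was: return fmt.Errorf("bundle name is not set")
	if b.Config.Bundle.Name == "" {
		return diag.Errorf("bundle name is not set")
	}
	// Was: return err
	if err := checkSomething(ctx, b); err != nil {
		return diag.FromErr(err)
	}
	// No errors and no warnings.
	return nil
}
```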

## Tests

* Existing tests pass.
* I confirmed no call site under `./bundle` or `./cmd/bundle` uses
`errors.Is` on the return value from mutators. This is relevant because
we cannot wrap errors with `%w` when calling `diag.Errorf` (like
`fmt.Errorf`; context in https://github.com/golang/go/issues/47641).
2024-03-25 14:18:47 +00:00
Pieter Noordhuis 1b879d44e1
Upgrade Terraform provider to 1.38.0 (#1308)
## Changes

Update to the latest release. No schema changes.

## Tests

Unit tests pass. Integration to be done as part of the release PR.
2024-03-25 09:17:52 +00:00
Pieter Noordhuis fd8dbff631
Update Go SDK to v0.36.0 (#1304)
## Changes

SDK release:
https://github.com/databricks/databricks-sdk-go/releases/tag/v0.36.0

No notable differences other than a few type name changes.

## Tests

Tests pass.
2024-03-22 13:15:54 +00:00
Pieter Noordhuis f202596a6f
Move bundle tests into bundle/tests (#1299)
## Changes

These tests were located in `bundle/tests/bundle` which meant they were
unable to reuse the helper functions defined in the `bundle/tests`
package. There is no need for these tests to live outside the package.

## Tests

Existing tests pass.
2024-03-21 10:37:05 +00:00
Pieter Noordhuis 0ef93c2502
Update Go SDK to v0.35.0 (#1300)
## Changes

SDK release:
https://github.com/databricks/databricks-sdk-go/releases/tag/v0.35.0

## Tests

Tests pass.
2024-03-20 13:57:53 +00:00
Andrew Nester de89af6f8c
Push deployment state right after files upload (#1293)
## Changes
Push deployment state right after files upload

## Tests
Integration tests succeed
2024-03-19 09:47:41 +00:00
Pieter Noordhuis 7c4b34945c
Rewrite relative paths using `dyn.Location` of the underlying value (#1273)
## Changes

This change addresses the path resolution behavior in resource
definitions. Previously, all paths were resolved relative to where the
resource was first defined, which could lead to confusion and errors
when paths were specified in different directories. The new behavior is
to resolve paths relative to where they are defined, making it more
intuitive.

However, to avoid breaking existing configurations, compatibility with
the old behavior is maintained.

## Tests

* Existing unit tests for path translation pass.
* Additional test to cover both the nominal and the fallback behavior.
2024-03-18 16:23:39 +00:00
Andrew Nester d216404f27
Do CheckRunningResource only after terraform.Write (#1292)
## Changes
CheckRunningResource does `terraform.Show`, which (I believe) expects a
valid `bundle.tf.json` that is only written as part of
`terraform.Write` later.

With this PR the order is changed.

Fixes #1286 

## Tests
Added regression E2E test
2024-03-18 15:39:18 +00:00
Andrew Nester 1b0ac61093
Added deployment state for bundles (#1267)
## Changes
This PR introduces a new structure (and a file) that is used locally
and synced remotely to the Databricks workspace to track bundle
deployment metadata.

The state is pulled from remote, updated, and pushed back remotely as
part of the `bundle deploy` command.

This state can be used for deployment sequencing as its `Version` field
is monotonically increasing on each deployment.

Currently, it only tracks files being synced as part of the deployment.

This helps fix the issue with files not being removed during deployments
on CI/CD, as the sync snapshot was never present there.
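A minimal sketch of what such a state file could contain; only the `Version` field and the list of synced files are described above, and the exact field names here are assumptions:

```
// Illustrative only, not the actual struct from the CLI.
type DeploymentState struct {
	// Monotonically increasing on each deployment,
	// so it can be used for deployment sequencing.
	Version int64 `json:"version"`

	// Files synced as part of the deployment; used to
	// remove files that are no longer part of the bundle.
	Files []string `json:"files"`
}
```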

Fixes #943 

## Tests
Added E2E (regression) test for files removal on CI/CD

---------

Co-authored-by: Pieter Noordhuis <pieter.noordhuis@databricks.com>
2024-03-18 14:41:58 +00:00
shreyas-goenka d4329f470f
Add integration test for mlops-stacks initialization (#1155)
## Changes
This PR:
1. Adds an integration test for mlops-stacks that checks the
initialization and deployment of the project was successful.
2. Fixes a bug in the initialization of templates from a non-TTY. We
need to process the input parameters in order since their descriptions
can refer to input parameters that came earlier in the interactive UX.

## Tests
The integration test passes in CI.
2024-03-12 14:15:54 +00:00
Andrew Nester c7818560ca
Add usage string when command fails with incorrect arguments (#1276)
## Changes
Add usage string when command fails with incorrect arguments

Fixes #1119

## Tests
Example output

```
> databricks libraries cluster-status 
Error: accepts 1 arg(s), received 0

Usage:
  databricks libraries cluster-status CLUSTER_ID [flags]

Flags:
  -h, --help   help for cluster-status

Global Flags:
      --debug            enable debug logging
  -o, --output type      output type: text or json (default text)
  -p, --profile string   ~/.databrickscfg profile
  -t, --target string    bundle target to use (if applicable)
```
2024-03-12 14:12:34 +00:00
Pieter Noordhuis 4a9a12af19
Retain location annotation when expanding globs for pipeline libraries (#1274)
## Changes

We now keep location metadata associated with every configuration value.
When expanding globs for pipeline libraries, this annotation was erased
because of the conversion to/from the typed structure. This change
modifies the expansion mutator to work with `dyn.Value` and retain the
location of the value that holds the glob pattern.

## Tests

Unit tests pass.
2024-03-11 21:59:36 +00:00
shreyas-goenka d5dc2bd1ca
Filter current user from resource permissions (#1262)
## Changes
The databricks terraform provider does not allow changing permissions of
the current user. Instead, the current identity is implicitly set to be
the owner of all resources on the platform side.

This PR introduces a mutator to filter permissions from the bundle
configuration at deploy time, allowing users to define permissions for
their own identities in their bundle config.

This allows configurations like the following, where both alice and bob
can collaborate on the same DAB:
```
permissions:
  - level: CAN_MANAGE
    user_name: alice

  - level: CAN_MANAGE
    user_name: bob
```

This PR is a reincarnation of
https://github.com/databricks/cli/pull/1145. The earlier attempt had to
be reverted due to metadata loss converting to and from the dynamic
configuration representation (reverted here:
https://github.com/databricks/cli/pull/1179)

## Tests
Unit test and manually
2024-03-11 15:05:15 +00:00
Pieter Noordhuis c05c0cd941
Include `dyn.Path` as argument to the visit callback function (#1260)
## Changes

This change means the callback supplied to `dyn.Foreach` can introspect
the path of the value it is being called for. It also prepares for
allowing visiting path patterns where the exact path is not known
upfront.

## Tests

Unit tests.
2024-03-07 13:56:50 +00:00
Pieter Noordhuis 74b1e05ed7
Update Go SDK to v0.34.0 (#1256)
## Changes

SDK release
https://github.com/databricks/databricks-sdk-go/releases/tag/v0.34.0

This incorporates two changes to the generation code:
* Use explicit empty check for response types (see
https://github.com/databricks/databricks-sdk-go/pull/831)
* Support subservices for the settings commands (see
https://github.com/databricks/databricks-sdk-go/pull/826)

As part of the subservices support, this change also updates how methods
are registered with their services. This used to be done with `init`
functions and now through inline function calls. This should have a
(negligible) positive impact on binary start time because we no longer
have to call as many `init` functions.
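Schematically, the generated registration code changed roughly as follows; the names below are illustrative, not the SDK's actual code:

```
// Before: each service file contributed an init function,
// all of which run unconditionally at program start.
func init() {
	register("jobs.list", listJobs)
}

// After: registration happens through an inline call when the
// service is constructed, avoiding the per-file init functions.
func newJobsService() *service {
	s := &service{}
	s.register("jobs.list", s.list)
	return s
}
```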

## Tests

tbd
2024-03-06 09:53:44 +00:00
Andrew Nester ecf9c52f61
Support relative paths in artifact files source section and always upload all artifact files (#1247)
Support relative paths in artifact files source section and always
upload all artifact files

Fixes #1156

## Tests
Added unit tests
2024-03-04 20:28:15 +00:00
Andrew Nester 09d1846e13
Return `application_id` for service principal lookups (#1245)
## Changes
Return `ApplicationId` for service principal lookups

Fixes #1234 

## Tests
Added (regression) tests
2024-03-04 16:12:10 +00:00
Andrew Nester 29ab96f327
Only transform wheel libraries when using trampoline (#1248)
## Changes
Only transform wheel libraries when using trampoline

## Tests
Added regression test
2024-03-04 12:34:03 +00:00
Pieter Noordhuis 04827688fb
Add `--validate-only` flag to run validate-only pipeline update (#1251)
## Changes

This flag starts a "validation-only" update.

## Tests

Unit and manual confirmation it does what it should.
2024-03-04 08:38:32 +00:00
Ilia Babanov d12f88e24d
Fix summary command when internal terraform config doesn't exist (#1242)
Check whether `bundle.tf.json` exists and, if not, create it before
executing `terraform init` (inside `terraform.Load`).

Fixes a problem where `terraform.Load` fails with:
```
Error: Failed to load plugin schemas

Error while loading schemas for plugin components: Failed to obtain provider
schema: Could not load the schema for provider
registry.terraform.io/databricks/databricks: failed to instantiate provider
"registry.terraform.io/databricks/databricks" to obtain schema: unavailable
provider "registry.terraform.io/databricks/databricks"..
```
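The shape of the fix, as a sketch using only the standard library (assumed imports: `errors`, `io/fs`, `os`, `path/filepath`); the function name is illustrative and the actual code lives inside `terraform.Load`:
```
func ensureTerraformConfig(workingDir string) error {
	path := filepath.Join(workingDir, "bundle.tf.json")
	_, err := os.Stat(path)
	if errors.Is(err, fs.ErrNotExist) {
		// Write an empty Terraform JSON config so that
		// `terraform init` can load the provider schema.
		return os.WriteFile(path, []byte("{}"), 0o644)
	}
	return err
}
```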
2024-03-01 08:25:12 +00:00
Andrew Nester 0839e6f66a
Added test to verify scripts.Execute mutator works correctly (#1237)
## Changes
Follow up to https://github.com/databricks/cli/pull/1232
2024-02-26 10:08:03 +00:00
Andrew Nester 1dbc086e5a
Upgrade Terraform provider to 1.37.0 (#1235)
## Changes
Upgrade Terraform provider to 1.37.0

Currently we're using 1.36.2 version which uses Go SDK 0.30 which does
not have U2M enabled for all clouds.
Upgrading to 1.37.0 allows TF provider (and thus DABs) to use U2M

Fixes #1231
2024-02-23 10:41:42 +00:00
Andrew Nester 1588a14d07
Add correct tag value for models in dev mode (#1230)
## Changes
Fixes #922

## Tests
Added regression test case
2024-02-22 14:52:49 +00:00
Miles Yucht b65ce75c1f
Use Go SDK Iterators when listing resources with the CLI (#1202)
## Changes
Currently, when the CLI runs a list API call (like listing jobs), it uses
the `List*All` methods from the SDK, which list all resources in the
collection. This is very slow for large collections: if you need to list
all jobs from a workspace that has 10,000+ jobs, you'll be waiting for
at least 100 RPCs to complete before seeing any output.

As an alternative to the List*All() methods, the SDK recently added an
iterator data structure that allows traversing the collection without
needing to list it completely first. New pages are fetched lazily when
the next requested item belongs to the next page. Using the List()
methods that return these iterators, the CLI can proactively print out
some of the response before the complete collection has been fetched.
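A minimal sketch of consuming one of these iterators, assuming the `listing.Iterator` shape with `HasNext`/`Next` from recent Go SDK versions (assumed imports: `context`, `fmt`, the SDK root package, and its `service/jobs` package):

```
func listJobNames(ctx context.Context, w *databricks.WorkspaceClient) error {
	// Stream jobs instead of listing the whole collection up front.
	it := w.Jobs.List(ctx, jobs.ListJobsRequest{})
	for it.HasNext(ctx) {
		job, err := it.Next(ctx)
		if err != nil {
			return err
		}
		// Each item can be rendered as soon as its page is fetched.
		fmt.Println(job.Settings.Name)
	}
	return nil
}
```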

This involves a pretty major rewrite of the rendering logic in `cmdio`.
The idea there is to define custom rendering logic based on the type of
the provided resource. There are three renderer interfaces:

1. textRenderer: supports printing something in a textual format (i.e.
not JSON, and not templated).
2. jsonRenderer: supports printing something in a pretty-printed JSON
format.
3. templateRenderer: supports printing something using a text template.

There are also three renderer implementations:

1. readerRenderer: supports printing a reader. This only implements the
textRenderer interface.
2. iteratorRenderer: supports printing a `listing.Iterator` from the Go
SDK. This implements jsonRenderer and templateRenderer, buffering 20
resources at a time before writing them to the output.
3. defaultRenderer: supports printing arbitrary resources (the previous
implementation).
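To make the three interfaces concrete, one plausible shape for them is sketched below; the method signatures are assumptions for illustration, not the actual `cmdio` definitions:

```
type textRenderer interface {
	// Render a human-readable representation (not JSON, not templated).
	renderText(ctx context.Context, w io.Writer) error
}

type jsonRenderer interface {
	// Render a pretty-printed JSON representation.
	renderJson(ctx context.Context, w io.Writer) error
}

type templateRenderer interface {
	// Render using a text template, with the header template
	// executed once and the row template executed per item.
	renderTemplate(ctx context.Context, tmpl *template.Template, w io.Writer) error
}
```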

Callers will use `cmdio.Render()` for rendering individual resources
or an `io.Reader`, and `cmdio.RenderIterator()` for rendering an
iterator. This separate method is needed to safely be able to match on
the type of the iterator, since Go does not allow runtime type matches
on generic types with an existential type parameter.

One other change that needs to happen is to split the templates used for
text representation of list resources into a header template and a row
template. The template is now executed multiple times for List API
calls, but the header should only be printed once. To support this, I
have added `headerTemplate` to `cmdIO`, and I have also changed
`RenderWithTemplate` to include a `headerTemplate` parameter everywhere.

## Tests
- [x] Unit tests for text rendering logic
- [x] Unit test for reflection-based iterator construction.

---------

Co-authored-by: Andrew Nester <andrew.nester@databricks.com>
2024-02-21 14:16:36 +00:00
dependabot[bot] d9f34e6b22
Bump github.com/databricks/databricks-sdk-go from 0.32.0 to 0.33.0 (#1222)
Bumps
[github.com/databricks/databricks-sdk-go](https://github.com/databricks/databricks-sdk-go)
from 0.32.0 to 0.33.0.
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/databricks/databricks-sdk-go/releases">github.com/databricks/databricks-sdk-go's
releases</a>.</em></p>
<blockquote>
<h2>v0.33.0</h2>
<p>Internal Changes:</p>
<ul>
<li>Add helper function to get header fields (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/pull/822">#822</a>).</li>
<li>Add Int64 to header type injection (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/pull/819">#819</a>).</li>
</ul>
<p>API Changes:</p>
<ul>
<li>Changed <code>Update</code> method for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#LakehouseMonitorsAPI">w.LakehouseMonitors</a>
workspace-level service with new required argument order.</li>
<li>Added <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#OnlineTablesAPI">w.OnlineTables</a>
workspace-level service.</li>
<li>Removed <code>AssetsDir</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#UpdateMonitor">catalog.UpdateMonitor</a>.</li>
<li>Added <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#ContinuousUpdateStatus">catalog.ContinuousUpdateStatus</a>.</li>
<li>Added <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#DeleteOnlineTableRequest">catalog.DeleteOnlineTableRequest</a>.</li>
<li>Added <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#FailedStatus">catalog.FailedStatus</a>.</li>
<li>Added <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#GetOnlineTableRequest">catalog.GetOnlineTableRequest</a>.</li>
<li>Added <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#OnlineTable">catalog.OnlineTable</a>.</li>
<li>Added <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#OnlineTableSpec">catalog.OnlineTableSpec</a>.</li>
<li>Added <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#OnlineTableState">catalog.OnlineTableState</a>.</li>
<li>Added <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#OnlineTableStatus">catalog.OnlineTableStatus</a>.</li>
<li>Added <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#PipelineProgress">catalog.PipelineProgress</a>.</li>
<li>Added <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#ProvisioningStatus">catalog.ProvisioningStatus</a>.</li>
<li>Added <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#TriggeredUpdateStatus">catalog.TriggeredUpdateStatus</a>.</li>
<li>Added <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#ViewData">catalog.ViewData</a>.</li>
<li>Added <code>ContentLength</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/files#DownloadResponse">files.DownloadResponse</a>.</li>
<li>Added <code>ContentType</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/files#DownloadResponse">files.DownloadResponse</a>.</li>
<li>Added <code>LastModified</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/files#DownloadResponse">files.DownloadResponse</a>.</li>
<li>Changed <code>LastModified</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/files#GetMetadataResponse">files.GetMetadataResponse</a>
to <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/files#LastModifiedHttpDate">files.LastModifiedHttpDate</a>.</li>
<li>Added <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/files#LastModifiedHttpDate">files.LastModifiedHttpDate</a>.</li>
<li>Removed <code>Config</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#ExternalModel">serving.ExternalModel</a>.</li>
<li>Added <code>Ai21labsConfig</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#ExternalModel">serving.ExternalModel</a>.</li>
<li>Added <code>AnthropicConfig</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#ExternalModel">serving.ExternalModel</a>.</li>
<li>Added <code>AwsBedrockConfig</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#ExternalModel">serving.ExternalModel</a>.</li>
<li>Added <code>CohereConfig</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#ExternalModel">serving.ExternalModel</a>.</li>
<li>Added <code>DatabricksModelServingConfig</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#ExternalModel">serving.ExternalModel</a>.</li>
<li>Added <code>OpenaiConfig</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#ExternalModel">serving.ExternalModel</a>.</li>
<li>Added <code>PalmConfig</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#ExternalModel">serving.ExternalModel</a>.</li>
<li>Removed <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#ExternalModelConfig">serving.ExternalModelConfig</a>.</li>
<li>Added <code>MaxProvisionedThroughput</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#ServedEntityInput">serving.ServedEntityInput</a>.</li>
<li>Added <code>MinProvisionedThroughput</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#ServedEntityInput">serving.ServedEntityInput</a>.</li>
<li>Added <code>MaxProvisionedThroughput</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#ServedEntityOutput">serving.ServedEntityOutput</a>.</li>
<li>Added <code>MinProvisionedThroughput</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#ServedEntityOutput">serving.ServedEntityOutput</a>.</li>
</ul>
</blockquote>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/databricks/databricks-sdk-go/blob/main/CHANGELOG.md">github.com/databricks/databricks-sdk-go's
changelog</a>.</em></p>
<blockquote>
<h2>0.33.0</h2>
<p>Internal Changes:</p>
<ul>
<li>Add helper function to get header fields (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/pull/822">#822</a>).</li>
<li>Add Int64 to header type injection (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/pull/819">#819</a>).</li>
</ul>
<p>API Changes:</p>
<ul>
<li>Changed <code>Update</code> method for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#LakehouseMonitorsAPI">w.LakehouseMonitors</a>
workspace-level service with new required argument order.</li>
<li>Added <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#OnlineTablesAPI">w.OnlineTables</a>
workspace-level service.</li>
<li>Removed <code>AssetsDir</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#UpdateMonitor">catalog.UpdateMonitor</a>.</li>
<li>Added <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#ContinuousUpdateStatus">catalog.ContinuousUpdateStatus</a>.</li>
<li>Added <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#DeleteOnlineTableRequest">catalog.DeleteOnlineTableRequest</a>.</li>
<li>Added <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#FailedStatus">catalog.FailedStatus</a>.</li>
<li>Added <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#GetOnlineTableRequest">catalog.GetOnlineTableRequest</a>.</li>
<li>Added <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#OnlineTable">catalog.OnlineTable</a>.</li>
<li>Added <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#OnlineTableSpec">catalog.OnlineTableSpec</a>.</li>
<li>Added <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#OnlineTableState">catalog.OnlineTableState</a>.</li>
<li>Added <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#OnlineTableStatus">catalog.OnlineTableStatus</a>.</li>
<li>Added <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#PipelineProgress">catalog.PipelineProgress</a>.</li>
<li>Added <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#ProvisioningStatus">catalog.ProvisioningStatus</a>.</li>
<li>Added <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#TriggeredUpdateStatus">catalog.TriggeredUpdateStatus</a>.</li>
<li>Added <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#ViewData">catalog.ViewData</a>.</li>
<li>Added <code>ContentLength</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/files#DownloadResponse">files.DownloadResponse</a>.</li>
<li>Added <code>ContentType</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/files#DownloadResponse">files.DownloadResponse</a>.</li>
<li>Added <code>LastModified</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/files#DownloadResponse">files.DownloadResponse</a>.</li>
<li>Changed <code>LastModified</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/files#GetMetadataResponse">files.GetMetadataResponse</a>
to <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/files#LastModifiedHttpDate">files.LastModifiedHttpDate</a>.</li>
<li>Added <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/files#LastModifiedHttpDate">files.LastModifiedHttpDate</a>.</li>
<li>Removed <code>Config</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#ExternalModel">serving.ExternalModel</a>.</li>
<li>Added <code>Ai21labsConfig</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#ExternalModel">serving.ExternalModel</a>.</li>
<li>Added <code>AnthropicConfig</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#ExternalModel">serving.ExternalModel</a>.</li>
<li>Added <code>AwsBedrockConfig</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#ExternalModel">serving.ExternalModel</a>.</li>
<li>Added <code>CohereConfig</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#ExternalModel">serving.ExternalModel</a>.</li>
<li>Added <code>DatabricksModelServingConfig</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#ExternalModel">serving.ExternalModel</a>.</li>
<li>Added <code>OpenaiConfig</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#ExternalModel">serving.ExternalModel</a>.</li>
<li>Added <code>PalmConfig</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#ExternalModel">serving.ExternalModel</a>.</li>
<li>Removed <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#ExternalModelConfig">serving.ExternalModelConfig</a>.</li>
<li>Added <code>MaxProvisionedThroughput</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#ServedEntityInput">serving.ServedEntityInput</a>.</li>
<li>Added <code>MinProvisionedThroughput</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#ServedEntityInput">serving.ServedEntityInput</a>.</li>
<li>Added <code>MaxProvisionedThroughput</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#ServedEntityOutput">serving.ServedEntityOutput</a>.</li>
<li>Added <code>MinProvisionedThroughput</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#ServedEntityOutput">serving.ServedEntityOutput</a>.</li>
</ul>
<p>OpenAPI SHA: cdd76a98a4fca7008572b3a94427566dd286c63b, Date:
2024-02-19</p>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="eba5c8b3ae"><code>eba5c8b</code></a>
Release v0.33.0 (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/issues/823">#823</a>)</li>
<li><a
href="6846045a98"><code>6846045</code></a>
Add Int64 to header type injection (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/issues/819">#819</a>)</li>
<li><a
href="c6a803ae18"><code>c6a803a</code></a>
Add helper function to get header fields (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/issues/822">#822</a>)</li>
<li>See full diff in <a
href="https://github.com/databricks/databricks-sdk-go/compare/v0.32.0...v0.33.0">compare
view</a></li>
</ul>
</details>
<br />

<details>
<summary>Most Recent Ignore Conditions Applied to This Pull
Request</summary>

| Dependency Name | Ignore Conditions |
| --- | --- |
| github.com/databricks/databricks-sdk-go | [>= 0.28.a, < 0.29] |
</details>


[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=github.com/databricks/databricks-sdk-go&package-manager=go_modules&previous-version=0.32.0&new-version=0.33.0)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)

Dependabot will resolve any conflicts with this PR as long as you don't
alter it yourself. You can also trigger a rebase manually by commenting
`@dependabot rebase`.

[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)

---

<details>
<summary>Dependabot commands and options</summary>
<br />

You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits
that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after
your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge
and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating
it. You can achieve the same result by closing it manually
- `@dependabot show <dependency name> ignore conditions` will show all
of the ignore conditions of the specified dependency
- `@dependabot ignore this major version` will close this PR and stop
Dependabot creating any more for this major version (unless you reopen
the PR or upgrade to it yourself)
- `@dependabot ignore this minor version` will close this PR and stop
Dependabot creating any more for this minor version (unless you reopen
the PR or upgrade to it yourself)
- `@dependabot ignore this dependency` will close this PR and stop
Dependabot creating any more for this dependency (unless you reopen the
PR or upgrade to it yourself)


</details>

---------

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Andrew Nester <andrew.nester@databricks.com>
2024-02-19 14:30:06 +00:00
Pieter Noordhuis a2a4948047
Allow use of variables references in primitive non-string fields (#1219)
## Changes

This change enables the use of bundle variables for boolean, integer,
and floating point fields.

## Tests

* Unit tests.
* I ran a manual test to confirm parameterizing the number of workers in
a cluster definition works.
2024-02-19 10:44:51 +00:00
Pieter Noordhuis f70ec359dc
Use `dyn.Value` as input to generating Terraform JSON (#1218)
## Changes

This builds on #1098 and uses the `dyn.Value` representation of the
bundle configuration to generate the Terraform JSON definition of
resources in the bundle.

The existing code (in `BundleToTerraform`) was not great and in an
effort to slightly improve this, I added a package `tfdyn` that includes
dedicated files for each resource type. Every resource type has its own
conversion type that takes the `dyn.Value` of the bundle-side resource
and converts it into Terraform resources (e.g. a job and optionally its
permissions).

Because we now use a `dyn.Value` as input, we can represent and emit
zero-values that have so far been omitted. For example, setting
`num_workers: 0` in your bundle configuration now propagates all the way
to the Terraform JSON definition.

## Tests

* Unit tests for every converter. I reused the test inputs from
`convert_test.go`.
* Equivalence tests in every existing test case checks that the
resulting JSON is identical.
* I manually compared the TF JSON file generated by the CLI from the
main branch and from this PR on all of our bundles and bundle examples
(internal and external) and found the output doesn't change (with the
exception of the odd zero-value being included by the version in this
PR).
2024-02-16 20:54:38 +00:00
Pieter Noordhuis 87dd46a3f8
Use dynamic configuration model in bundles (#1098)
## Changes

This is a fundamental change to how we load and process bundle
configuration. We now depend on the configuration being represented as a
`dyn.Value`. This representation is functionally equivalent to Go's
`any` (it can hold a value of any type) and allows us to capture metadata associated with
a value, such as where it was defined (e.g. file, line, and column). It
also allows us to represent Go's zero values properly (e.g. empty
string, integer equal to 0, or boolean false).
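To make the idea concrete, a value-with-metadata representation could look like the following; this is a simplified illustration, not the actual `libs/dyn` types:

```
// A dynamic value that remembers where it was defined.
type Location struct {
	File   string
	Line   int
	Column int
}

type Value struct {
	v   any      // underlying value; zero values are representable
	loc Location // file, line, and column where the value was defined
}

func (v Value) Location() Location { return v.loc }
```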

Using this representation allows us to let the configuration model
deviate from the typed structure we have been relying on so far
(`config.Root`). We need to deviate from these types when using
variables for fields that are not a string themselves. For example,
using `${var.num_workers}` for an integer `workers` field was impossible
until now (though not implemented in this change).

The loader for a `dyn.Value` includes functionality to capture any and
all type mismatches between the user-defined configuration and the
expected types. These mismatches can be surfaced as validation errors in
future PRs.

Given that many mutators expect the typed struct to be the source of
truth, this change converts between the dynamic representation and the
typed representation on mutator entry and exit. Existing mutators can
continue to modify the typed representation and these modifications are
reflected in the dynamic representation (see `MarkMutatorEntry` and
`MarkMutatorExit` in `bundle/config/root.go`).

Required changes included in this change:
* The existing interpolation package is removed in favor of
`libs/dyn/dynvar`.
* Functionality to merge job clusters, job tasks, and pipeline clusters
are now all broken out into their own mutators.

To be implemented later:
* Allow variable references for non-string types.
* Surface diagnostics about the configuration provided by the user in
the validation output.
* Some mutators use a resource's configuration file path to resolve
related relative paths. These depend on `bundle/config/paths.Path` being
set and populated through `ConfigureConfigFilePath`. Instead, they
should interact with the dynamically typed configuration directly. Doing
this also unlocks being able to differentiate different base paths used
within a job (e.g. a task override with a relative path defined in a
directory other than the base job).

## Tests

* Existing unit tests pass (some have been modified to accommodate)
* Integration tests pass
2024-02-16 19:41:58 +00:00
Pieter Noordhuis 788ec81785
Use `any` as type for data sources and resources in `tf/schema` (#1216)
## Changes

We plan to use the any-equivalent of a `dyn.Value` such that we can use
variable references for non-string fields (e.g.
`${databricks_job.some_job.id}` where an integer is expected), as well
as properly emit zero values for primitive types (e.g. 0 for integers or
false for booleans).

This change is in preparation for the above.

## Tests

Unit tests.
2024-02-16 12:46:24 +00:00
Pieter Noordhuis ffae10d904
Bump Terraform provider to v1.36.2 (#1215)
## Changes

* Update `go.mod` with latest dependencies
* Update `go.mod` to require Go 1.21 to match root `go.mod`
* Regenerate structs for Terraform provider v1.36.2

## Tests

n/a
2024-02-16 07:05:45 +00:00
dependabot[bot] 299e9b56a6
Bump github.com/databricks/databricks-sdk-go from 0.30.1 to 0.32.0 (#1199)
Bumps
[github.com/databricks/databricks-sdk-go](https://github.com/databricks/databricks-sdk-go)
from 0.30.1 to 0.32.0.

---------

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Andrew Nester <andrew.nester@databricks.com>
2024-02-15 14:52:17 +00:00
Andrew Nester 80670eceed
Added `bundle deployment bind` and `unbind` command (#1131)
## Changes
Added `bundle deployment bind` and `unbind` command.

This command allows binding bundle-defined resources to existing
resources in the Databricks workspace so they become DABs-managed.

## Tests
Manually + added E2E test
2024-02-14 18:04:45 +00:00
Miles Yucht e8b0698e19
Regenerate the CLI using the same OpenAPI spec as the SDK (#1205)
## Changes
The OpenAPI spec used to generate the CLI doesn't match the version used
for the SDK version that the CLI currently depends on. This PR
regenerates the CLI based on the same version of the OpenAPI spec used
by the SDK on v0.30.1.

## Tests
2024-02-13 14:33:59 +00:00
shreyas-goenka 52b813bd8e
Skip `for_each_task` when generating the bundle schema (#1204)
## Changes
Bundle schema generation does not support recursive API fields. This PR
skips generation for for_each_task until we add proper support for
recursive types in the bundle schema.

## Tests
Manually. This fixes the generation of the CLI, and the bundle schema
command works as expected, with the sub-schema for `for_each_task` set
to null in the output.
```
"for_each_task": null,
```
2024-02-13 14:13:47 +00:00
Andrew Nester bc30c9ed4a
Added `--restart` flag for `bundle run` command (#1191)
## Changes
Added `--restart` flag for `bundle run` command

When running with this flag, `bundle run` will cancel all existing runs
before starting a new one

## Tests
Manually
2024-02-09 14:33:14 +00:00
Pieter Noordhuis 4073e45d4b
Use mockery to generate mocks compatible with testify/mock (#1190)
## Changes

This is the same approach we use in the Go SDK.

## Tests

Tests pass.
2024-02-08 15:18:53 +00:00
Pieter Noordhuis f7d1a5862d
Use allowlist for Git-related fields to include in metadata (#1187)
## Changes

When new fields are added they should not automatically propagate to the
bundle metadata.

## Tests

Test passes.
2024-02-08 12:23:14 +00:00
Pieter Noordhuis 8e58e04e8f
Move folders package into libs (#1184)
## Changes

This is the last top-level package that doesn't need to be top-level.
2024-02-07 16:33:18 +00:00
Andrew Nester 6edab93233
Added warning when trying to deploy bundle with `--fail-if-running` and running resources (#1163)
## Changes
Deploying a bundle while bundle resources are running can be disruptive
for jobs and pipelines in progress.

With this change, during the deployment phase (before uploading any
resources), if `--fail-if-running` is specified, DABs checks whether any
resources are running and, if so, fails the deployment.

## Tests
Manual + add tests
2024-02-07 11:17:17 +00:00
Andrew Nester de363faa53
Make sure grouped flags are added to the command flag set (#1180)
## Changes
Make sure grouped flags are added to the command flag set

## Tests
Added regression tests
2024-02-07 10:27:13 +00:00
Pieter Noordhuis 6e075e8cf8
Revert "Filter current user from resource permissions (#1145)" (#1179)
## Changes

This reverts commit 4131069a4b.

The integration test for metadata computation failed. The back and forth
to `dyn.Value` erases unexported fields that the code currently still
depends on. We'll have to retry on top of #1098.
2024-02-07 09:22:44 +00:00
Andrew Nester 2bbb644749
Group bundle run flags by job and pipeline types (#1174)
## Changes
Group bundle run flags by job and pipeline types

## Tests
```
Run a resource (e.g. a job or a pipeline)

Usage:
  databricks bundle run [flags] KEY

Job Flags:
      --dbt-commands strings                 A list of commands to execute for jobs with DBT tasks.
      --jar-params strings                   A list of parameters for jobs with Spark JAR tasks.
      --notebook-params stringToString       A map from keys to values for jobs with notebook tasks. (default [])
      --params stringToString                comma separated k=v pairs for job parameters (default [])
      --pipeline-params stringToString       A map from keys to values for jobs with pipeline tasks. (default [])
      --python-named-params stringToString   A map from keys to values for jobs with Python wheel tasks. (default [])
      --python-params strings                A list of parameters for jobs with Python tasks.
      --spark-submit-params strings          A list of parameters for jobs with Spark submit tasks.
      --sql-params stringToString            A map from keys to values for jobs with SQL tasks. (default [])

Pipeline Flags:
      --full-refresh strings   List of tables to reset and recompute.
      --full-refresh-all       Perform a full graph reset and recompute.
      --refresh strings        List of tables to update.
      --refresh-all            Perform a full graph update.

Flags:
  -h, --help      help for run
      --no-wait   Don't wait for the run to complete.

Global Flags:
      --debug            enable debug logging
  -o, --output type      output type: text or json (default text)
  -p, --profile string   ~/.databrickscfg profile
  -t, --target string    bundle target to use (if applicable)
      --var strings      set values for variables defined in bundle config. Example: --var="foo=bar"
   ```
2024-02-06 14:51:02 +00:00
shreyas-goenka 4131069a4b
Filter current user from resource permissions (#1145)
## Changes
The databricks terraform provider does not allow changing permissions of
the current user. Instead, the current identity is implicitly set to be
the owner of all resources on the platform side.

This PR introduces a mutator to filter permissions from the bundle
configuration, allowing users to define permissions for their own
identities in their bundle config.

This allows configurations like the following, where both alice and bob
can collaborate on the same DAB:
```
permissions:
  - level: CAN_MANAGE
    user_name: alice

  - level: CAN_MANAGE
    user_name: bob
```

## Tests
Unit test and manually
2024-02-06 12:45:08 +00:00
Pieter Noordhuis 33c446dadd
Refactor library to artifact matching to not use pointers (#1172)
## Changes

The approach to do this was:
1. Iterate over all libraries in all job tasks
2. Find references to local libraries
3. Store pointer to `compute.Library` in the matching artifact file to
signal it should be uploaded

This breaks down when introducing #1098 because we can no longer track
unexported state across mutators. The approach in this PR performs the
path matching twice; once in the matching mutator where we check if each
referenced file has an artifacts section, and once during artifact
upload to rewrite the library path from a local file reference to an
absolute Databricks path.

## Tests

Integration tests pass.
2024-02-05 15:29:45 +00:00
shreyas-goenka cb3ad737f1
Add short_name helper function to bundle init templates (#1167)
## Changes
Adds the `short_name` helper function. `short_name` is useful when
templates do not want to print the full userName (typically an email
address or service principal application ID) of the current user.
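A minimal sketch of what such a helper could do, assuming the user name is an email-style string (assumed import: `strings`); this is an illustration, not the template engine's actual code:

```
// shortName returns the local part of an email-style user name,
// e.g. "jane.doe@example.com" -> "jane.doe".
func shortName(userName string) string {
	name, _, _ := strings.Cut(userName, "@")
	return name
}
```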

## Tests
Integration test. Also adds integration tests for other helper functions
that interact with the Databricks API.
2024-02-01 16:46:07 +00:00
Andrew Nester 0b3eeb8e54
Allow specifying executable in artifact section and skip bash from WSL (#1169)
## Changes
Allow specifying executable in artifact section

```
artifacts:
  test:
    type: whl
    executable: bash
    ...
```

We also skip bash found on Windows if it comes from WSL because it
won't be executed correctly; see the issue referenced below.

Fixes #1159
2024-02-01 14:10:04 +00:00
Andrew Nester f269f8015d
Added `bundle generate pipeline` command (#1139)
## Changes
Added `bundle generate pipeline` command

Usage as the following

```
databricks bundle generate pipeline --existing-pipeline-id f3b8c580-0a88-4b55-xxxx-yyyyyyyyyy
```

## Tests
Manually + added E2E test
2024-01-25 11:35:14 +00:00
Ilia Babanov 9c3e4fda7c
Add "bundle summary" command (#1123)
The plan is to use the new command in the Databricks VSCode extension to
render "modified" UI state in the bundle resource tree elements, plus
use resource IDs to generate links for the resources.

### New revision
- Renamed `remote-state` to `summary`
- Added "modified statuses" to all resources. Currently we don't set
"updated" status - it's either nothing, or created/deleted
- Added tests for the `TerraformToBundle` command
2024-01-25 11:32:47 +00:00
shreyas-goenka cf2a1c38ba
Set run_as permissions after variable interpolation (#1141)
## Changes

This PR sets run as permissions after variable interpolation.

Terraform does not allow specifying permissions for current user.

The following configuration would fail because we would assign a
permission block for self, bypassing this check here:
4ee926b885/bundle/config/mutator/run_as.go (L47)

```
run_as:
  user_name: ${workspace.current_user.userName}
```



## Tests
Manually, setting run_as to ${workspace.current_user.userName} works now
2024-01-24 12:22:04 +00:00
Andrew Nester 1b6241746e
Use MockWorkspaceClient from SDK instead of WithImpl mocking (#1134)
## Changes
Use MockWorkspaceClient from SDK instead of WithImpl mocking
2024-01-19 14:12:58 +00:00
Andrew Nester 70fe0e36ef
Added `databricks bundle generate job` command (#1043)
## Changes
Now it's possible to generate bundle configuration for an existing job.
For now it only supports jobs with notebook tasks.

It will download the notebooks referenced in the job tasks and generate
bundle YAML config for this job, which can be included in a larger bundle.

## Tests
Running command manually

Example of generated config
```
resources:
  jobs:
    job_128737545467921:
      name: Notebook job
      format: MULTI_TASK
      tasks:
        - task_key: as_notebook
          existing_cluster_id: 0704-xxxxxx-yyyyyyy
          notebook_task:
            base_parameters:
              bundle_root: /Users/andrew.nester@databricks.com/.bundle/job_with_module_imports/development/files
            notebook_path: ./entry_notebook.py
            source: WORKSPACE
          run_if: ALL_SUCCESS
      max_concurrent_runs: 1
 ```

## Tests
Manual (on our last 100 jobs) + added end-to-end test

```
--- PASS: TestAccGenerateFromExistingJobAndDeploy (50.91s)
PASS
coverage: 61.5% of statements in ./...
ok github.com/databricks/cli/internal/bundle 51.209s coverage: 61.5% of
statements in ./...
```
2024-01-17 14:26:33 +00:00
Andrew Nester ef67b1755e
Do not require positional arguments if they should be provided in JSON (#1125)
## Changes
Do not require positional arguments if they should be provided in JSON

Fixes #1122
2024-01-17 10:53:50 +00:00
Pieter Noordhuis 06b50670e1
Support passing job parameters to bundle run (#1115)
## Changes

This change adds support for job parameters. If job parameters are
specified for a job that doesn't define job parameters it returns an
error. Conversely, if task parameters are specified for a job that
defines job parameters, it also returns an error.

This change moves the options structs and their functions to separate
files and backfills test coverage for them.

Job parameters can now be specified with `--params foo=bar,bar=qux`.

## Tests

Unit tests and manual integration testing.
2024-01-15 07:42:36 +00:00
Pieter Noordhuis 3c76a11d00
Upgrade Go SDK to v0.29.0 (#1111)
## Changes

See:
* https://github.com/databricks/databricks-sdk-go/releases/tag/v0.29.0
* https://github.com/databricks/databricks-sdk-go/releases/tag/v0.28.0

## Tests

Unit and integration tests pass.
2024-01-11 08:16:25 +00:00
Pieter Noordhuis f5c46478f4
Upgrade golang.org/x/crypto to v0.17.0 in internal module (#1110)
## Changes

This addresses https://github.com/databricks/cli/security/dependabot/12.
2024-01-10 13:53:01 +00:00
Andrew Nester 4b01fff03d
Fixed instance pool resolving by name (#1102)
## Changes
Fixed instance pool resolving by name

## Tests
Added regression test
2024-01-05 10:50:53 +00:00
Andrew Nester 5fb40f9d07
Allow referencing bundle resources by name (#872)
## Changes
Now we can define variables whose values reference different
Databricks resources by name.
When referenced like this, DABs automatically looks up the resource by
that name and replaces the reference with the ID of the referenced
resource. Thus, when the variable is used in the configuration, it
contains the correctly resolved resource ID.

The resolvers are code-generated, so DABs supports referencing all
resources that have `GetByName`-like methods in the Go SDK.

### Example

```
variables:
  my_cluster_id:
    description: An existing cluster.
    lookup: 
      cluster: "12.2 shared"

resources:
  jobs:
    my_job:
      name: "My Job"
      tasks:
        - task_key: TestTask
          existing_cluster_id: ${var.my_cluster_id}

targets:
  dev:
    variables:
      my_cluster_id:
        lookup: 
           cluster: "dev-cluster"
```
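Under the hood, a generated resolver boils down to a lookup call like the one below; `GetByClusterName` exists in the Go SDK, but the surrounding wiring here is a sketch (assumed imports: `context` and the SDK root package):

```
func resolveClusterID(ctx context.Context, w *databricks.WorkspaceClient) (string, error) {
	// Resolve the "12.2 shared" cluster from the example to its ID.
	cluster, err := w.Clusters.GetByClusterName(ctx, "12.2 shared")
	if err != nil {
		return "", err
	}
	return cluster.ClusterId, nil
}
```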

## Tests
Added unit test + manual testing

---------

Co-authored-by: shreyas-goenka <88374338+shreyas-goenka@users.noreply.github.com>
2024-01-04 21:04:42 +00:00
Lennart Kats (databricks) 167deec8c3
Change recommended production deployment path from /Shared to /Users (#1091)
## Changes

This PR changes the default and `mode: production` recommendation to
target `/Users` for deployment. Previously, we used `/Shared`, but
because of a lack of POSIX-like permissions in WorkspaceFS this meant
that files inside would be readable and writable by other users in the
workspace.

Detailed change:
* `default-python` no longer uses a path that starts with `/Shared`
* `mode: production` no longer requires a path that starts with
`/Shared`
 
## Related PRs

Docs: https://github.com/databricks/docs/pull/14585
Examples: https://github.com/databricks/bundle-examples/pull/17

## Tests

* Manual tests
* Template unit tests (with an extra check to avoid /Shared)
2024-01-02 19:58:24 +00:00
Lennart Kats (databricks) 9a1f078bd9
Improve error when bundle root is not writable (#1093)
## Changes

This improves the error when deploying to a bundle root that the current
user doesn't have write access to. This can come up slightly more often
since the change of https://github.com/databricks/cli/pull/1091.

Before this change:

```
$ databricks bundle deploy --target prod
Building my_project...
Error: no such directory: /Users/lennart.kats@databricks.com/.bundle/my_project/prod/state
```

After this change:

```
$ databricks bundle deploy --target prod
Building my_project...
Error: cannot write to deployment root (this can indicate a previous deploy was done with a different identity): /Users/lennart.kats@databricks.com/.bundle/my_project/prod
```

Note that this change uses the "no such directory" error returned from
the filer.
2023-12-28 13:15:21 +00:00
Pieter Noordhuis fa3c8b1017
Use resource key as name in permissions code (#1087)
## Changes

The code relied on the `Name` property being accessible for every
resource. This is generally true, but because these property structs are
embedded as pointer, they can be nil. This is also why the tests had to
initialize the embedded struct to pass. This changes the approach to use
the keys from the resource map instead, so that we no longer rely on the
non-nil embedded struct.

Note: we should evaluate whether we should turn these into values
instead of pointers. I don't recall if we get value from them being
pointers.

## Tests

Unit tests pass.
2023-12-22 14:45:53 +00:00
Andrew Nester ac37a592f1
Added exec.NewCommandExecutor to execute commands with correct interpreter (#1075)
## Changes
Instead of handling command chaining ourselves, we execute the passed
commands as-is by storing them in a temp file and passing it to the
correct interpreter (bash or cmd) based on the OS.
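A minimal sketch of the approach using only the standard library (`os`, `os/exec`, `runtime`); the actual `exec.NewCommandExecutor` API may differ:

```
func runScript(script string) error {
	ext, shell, flag := ".sh", "bash", "-e"
	if runtime.GOOS == "windows" {
		ext, shell, flag = ".cmd", "cmd", "/C"
	}
	// Store the commands as-is in a temp file...
	f, err := os.CreateTemp("", "script-*"+ext)
	if err != nil {
		return err
	}
	defer os.Remove(f.Name())
	if _, err := f.WriteString(script); err != nil {
		f.Close()
		return err
	}
	if err := f.Close(); err != nil {
		return err
	}
	// ...and pass the file to the interpreter for this OS.
	cmd := exec.Command(shell, flag, f.Name())
	cmd.Stdout, cmd.Stderr = os.Stdout, os.Stderr
	return cmd.Run()
}
```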

Fixes #1065 

## Tests
Added unit tests
2023-12-21 15:45:23 +00:00
Lennart Kats (databricks) 875c9d2db1
Tune output of bundle deploy command (#1047)
## Changes

Update the output of the `deploy` command to be more concise and
consistent:
```
$ databricks bundle deploy
Building my_project...
Uploading my_project-0.0.1+20231207.205106-py3-none-any.whl...
Uploading bundle files to /Users/lennart.kats@databricks.com/.bundle/my_project/dev/files...
Deploying resources...
Updating deployment state...
Deployment complete!
```

This does away with the intermediate success messages, makes consistent
use of `...`, and only prints the success message at the very end after
everything is completed.

Below is the original output for comparison:

```
$ databricks bundle deploy
Detecting Python wheel project...
Found Python wheel project at /tmp/output/my_project
Building my_project...
Build succeeded
Uploading my_project-0.0.1+20231207.205134-py3-none-any.whl...
Upload succeeded
Starting upload of bundle files
Uploaded bundle files at /Users/lennart.kats@databricks.com/.bundle/my_project/dev/files!

Starting resource deployment
Resource deployment completed!
```
2023-12-21 08:00:37 +00:00
shreyas-goenka 2d93f62f21
Set metadata fields required to enable break-glass UI for jobs (#880)
## Changes

This PR sets the following fields for all jobs that are deployed from a
DAB
1. `deployment`: This provides the platform with the path to a file to
read the metadata from.
2. `edit_mode`: This tells the platform to display the break-glass UI
for jobs deployed from a DAB. Setting this is required to re-lock the UI
after a user clicks "disconnect from source".
3. `format = MULTI_TASK`. This makes the Terraform provider always use
jobs API 2.1 for creating/updating the job. Required because
`deployment` and `edit_mode` are only available in API 2.1.

## Tests

Unit test and manually. Manually verified that deployments trigger the
break glass UI. Manually verified there is no Terraform drift when all
three fields are set.

---------

Co-authored-by: Pieter Noordhuis <pieter.noordhuis@databricks.com>
2023-12-19 07:38:52 +00:00
Pieter Noordhuis cee70a53c8
Test existing behavior when loading non-string spark conf values (#1071)
## Changes

This test is expected to fail when we enable the custom YAML loader.
2023-12-18 11:22:22 +00:00
Andrew Nester a6ec9ac08b
Upgrade Go SDK to 0.27.0 (#1064)
## Changes
Upgrade Go SDK to 0.27.0
2023-12-14 08:15:00 +00:00
Pieter Noordhuis 37671d9f54
Fix passthrough of pipeline notifications (#1058)
## Changes

Notifications weren't passed along because of a plural vs singular
mismatch.

## Tests

* Added unit test coverage.
* Manually confirmed it now works in an example bundle.
2023-12-12 11:36:06 +00:00
shreyas-goenka b479a7cf67
Upgrade Terraform schema version to v1.31.1 (#1055)
Co-authored-by: Pieter Noordhuis <pieter.noordhuis@databricks.com>
2023-12-11 21:53:21 +00:00
shreyas-goenka 6002f49c87
Move bundle schema update to an internal module (#1012)
## Changes

This PR:
1. Move code to load bundle JSON Schema descriptions from the OpenAPI
spec to an internal Go module
2. Remove command line flags from the `bundle schema` command. These
flags were meant for internal processes and at no point were meant for
customer use.
3. Regenerate `bundle_descriptions.json`
4. Add support for `bundle: "deprecated"`. The `environments` field is
tagged as deprecated in this PR and consequently will no longer be a
part of the bundle schema.

## Tests
Tested by regenerating the CLI against its current OpenAPI spec (as
defined in `__openapi_sha`). The `bundle_descriptions.json` in this PR
was generated from the code generator.

Manually checked that the autocompletion / descriptions from the new
bundle schema are correct.
2023-12-06 10:45:18 +00:00
Andrew Nester 83d50001fc
Pass parameters to task when run with `--python-params` and `python_wheel_wrapper` is true (#1037)
## Changes
It makes the behaviour consistent whether `python_wheel_wrapper` is on
or off when a job is run with the `--python-params` flag.

In `python_wheel_wrapper` mode it converts the dynamic `python_params`
into a dynamic, specially named `notebook_param`; the wrapper reads them
with `dbutils` and passes them to `sys.argv`.

Fixes #1000

## Tests
Added an integration test.

Integration tests pass.
2023-12-01 10:35:20 +00:00
shreyas-goenka 677926b78b
Fix panic when bundle auth resolution fails (#1002)
## Changes
The CLI would panic if invalid bundle auth was set up when running CLI
commands. This PR removes the panic and shows the error message directly
instead.

## Tests
The CWD is a bundle with:
```
workspace:
  profile: DEFAULT
```

Before:
```
shreyas.goenka@THW32HFW6T bundle-playground % cli clusters list
panic: resolve: /Users/shreyas.goenka/.databrickscfg has no DEFAULT profile configured. Config: profile=DEFAULT

goroutine 1 [running]:
```

After:
```
shreyas.goenka@THW32HFW6T bundle-playground % cli clusters list
Error: cannot resolve bundle auth configuration: resolve: /Users/shreyas.goenka/.databrickscfg has no DEFAULT profile configured. Config: profile=DEFAULT
```

```
shreyas.goenka@THW32HFW6T bundle-playground % DATABRICKS_CONFIG_FILE=/dev/null cli bundle deploy
Error:  cannot resolve bundle auth configuration: resolve: /dev/null has no DEFAULT profile configured. Config: profile=DEFAULT, config_file=/dev/null. Env: DATABRICKS_CONFIG_FILE
```
2023-11-30 14:28:01 +00:00
Andrew Nester 4d8d825746
Fixed panic when job has trigger and in development mode (#1026)
## Changes
Fixed a panic that occurred when a job has a trigger configured and
the bundle is deployed in development mode.
2023-11-29 16:32:42 +00:00
Andrew Nester 833746cbdd
Do not replace pipeline libraries if there are no matches for pattern (#1021)
## Changes
If a glob call for a defined pipeline library yields no matches, leave
the entry as-is. The next mutators in the chain will detect that the
file is missing, and the resulting error is more user-friendly.


Before the change

```
Starting resource deployment
Error: terraform apply: exit status 1

Error: cannot create pipeline: libraries must contain at least one element
```

After

```
Error: notebook ./non-existent not found
```
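
Roughly, the new behavior amounts to the following sketch (the real
code operates on the bundle's pipeline library entries rather than
bare strings):

```go
package main

import (
	"fmt"
	"path/filepath"
)

// expand returns the glob matches for a library path, or the original
// entry untouched when nothing matches, so that a later mutator can
// report the missing file with a friendlier error.
func expand(path string) []string {
	matches, err := filepath.Glob(path)
	if err != nil || len(matches) == 0 {
		return []string{path} // keep the entry as-is
	}
	return matches
}

func main() {
	fmt.Println(expand("./non-existent")) // [./non-existent]
}
```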


## Tests
Added regression unit tests
2023-11-29 13:20:13 +00:00
Andrew Nester 5431174302
Do not add wheel content hash in uploaded Python wheel path (#1015)
## Changes
Removed the content hash from the upload path since it's not useful
anyway.

The main reason for the change was to make re-installation work on
all-purpose clusters. But for that to work, the wheel version needs to
be increased anyway, so having only a hash in the path is useless.

Note: using the `--build-number` (build tag) flag does not help with
re-installing libraries on all-purpose clusters. The reason is that
`pip` ignores the build tag when upgrading a library and only looks at
the wheel version. The build tag is only used for sorting versions: the
one with the higher build tag takes priority at install time, and then
only when no version of the library is installed yet.
See
a15dd75d98/src/pip/_internal/index/package_finder.py (L522-L556)
https://github.com/pypa/pip/issues/4781

Thus, the only way to reinstall the library on an all-purpose cluster
is to increase the wheel version manually or to generate it
automatically, e.g.:
```
from setuptools import setup
import datetime

setup(
  version=datetime.datetime.utcnow().strftime("%Y%m%d.%H%M%S"),
  ...
)
```

## Tests
Integration tests passed.
2023-11-29 10:40:12 +00:00
Pieter Noordhuis 6187803007
Correctly overwrite local state if remote state is newer (#1008)
## Changes

A bug in the code that pulls the remote state could cause the local
state to be empty instead of a copy of the remote state. This happened
only if the local state was present and stale when compared to the
remote version.

We correctly checked for the state serial to see if the local state had
to be replaced but didn't seek back on the remote state before writing
it out. Because the staleness check would read the remote state in full,
copying from the same reader would immediately yield an EOF.
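
A minimal sketch of the failure mode and the fix, using an in-memory
reader in place of the filer:

```go
package main

import (
	"bytes"
	"fmt"
	"io"
	"strings"
)

func main() {
	remote := strings.NewReader(`{"serial": 5}`)

	// The staleness check reads the remote state in full.
	raw, _ := io.ReadAll(remote)
	fmt.Println("checked serial in:", string(raw))

	// Bug: copying from the same reader now yields an immediate EOF,
	// so the local state file ends up empty.
	var local bytes.Buffer
	n, _ := io.Copy(&local, remote)
	fmt.Println("bytes copied without seeking:", n) // 0

	// Fix: seek back to the start before writing the state out.
	remote.Seek(0, io.SeekStart)
	n, _ = io.Copy(&local, remote)
	fmt.Println("bytes copied after seeking:", n) // 13
}
```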

## Tests

* Unit tests for state pull and push mutators that rely on a mocked
filer.
* An integration test that deploys the same bundle from multiple paths,
triggering the staleness logic.

Both failed prior to the fix and now pass.
2023-11-24 11:15:46 +00:00
Andrew Nester 48e293c72c
Pass `USERPROFILE` environment variable to Terraform (#1001)
## Changes
It appears that the `USERPROFILE` environment variable indicates where
the Azure CLI stores its configuration data (the `.azure` folder).

https://learn.microsoft.com/en-us/cli/azure/azure-cli-configuration#cli-configuration-file

Passing it to the Terraform executable allows Terraform to authenticate
correctly using the Azure CLI.
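
Roughly, the fix amounts to forwarding the variable when constructing
the Terraform command. A sketch, assuming the CLI builds an explicit
environment allow-list for the child process:

```go
package main

import (
	"os"
	"os/exec"
)

func main() {
	env := []string{"PATH=" + os.Getenv("PATH")}
	if v, ok := os.LookupEnv("USERPROFILE"); ok {
		// Forward USERPROFILE so the Azure CLI can locate its
		// configuration data (the .azure folder) on Windows.
		env = append(env, "USERPROFILE="+v)
	}

	cmd := exec.Command("terraform", "apply")
	cmd.Env = env
	_ = cmd.Run()
}
```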

Fixes #983 

## Tests
Ran a deployment on a Windows VM before and after the fix.
2023-11-22 09:16:28 +00:00
Andrew Nester fa89db57e9
Enable `spark_jar_task` with local JAR libraries (#993)
## Changes
Previously, local JAR paths were transformed to remote paths during
initialisation, so the artifact-building logic did not recognise such
libraries as local files that need to be built and uploaded.

Now it's possible to use `spark_jar_task` with local JAR libraries on
DBR 14.1+ clusters.

Example configuration
```
bundle:
  name: spark-jar

workspace:
  host: ***

artifacts:
  my_java_code:
    path: ./sample-java
    build: "javac PrintArgs.java && jar cvfm PrintArgs.jar META-INF/MANIFEST.MF PrintArgs.class"
    files:
      - source: "/Users/andrew.nester/dabs/wheel/sample-java/PrintArgs.jar"

resources:
  jobs:
    print_args:
      name: "Print Args"
      tasks:
        - task_key: Print
          new_cluster:
            num_workers: 0
            spark_version: 14.2.x-scala2.12
            node_type_id: i3.xlarge
            spark_conf:
              "spark.databricks.cluster.profile": "singleNode"
              "spark.master": "local[*]"
            custom_tags:
              ResourceClass: "SingleNode"
          spark_jar_task:
            main_class_name: PrintArgs
          libraries:
            - jar: ./sample-java/PrintArgs.jar
```

## Tests
Manually ran `bundle deploy` and `bundle run`.
2023-11-21 10:15:09 +00:00
Pieter Noordhuis 489d6fa1b8
Replace direct calls with `bundle.Apply` (#990)
## Changes

Some test call sites called directly into the mutator's `Apply` function
instead of `bundle.Apply`. Calling into `bundle.Apply` is preferred
because that's where we can run pre/post logic common across all
mutators.
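
For illustration, a test call site changes shape as in the sketch below
(signatures approximate for this era of the codebase; `m` stands for
any mutator under test):

```go
package example

import (
	"context"

	"github.com/databricks/cli/bundle"
)

func applyInTest(ctx context.Context, b *bundle.Bundle, m bundle.Mutator) error {
	// Not this; it skips the pre/post logic shared by all mutators:
	//   err := m.Apply(ctx, b)
	// But this:
	return bundle.Apply(ctx, b, m)
}
```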

## Tests

Pass.
2023-11-15 14:19:18 +00:00