mirror of https://github.com/databricks/cli.git
36 Commits
Author | SHA1 | Message | Date |
---|---|---|---|
Hector Castejon Diaz |
c83ade86af
|
WIP | |
dependabot[bot] |
f8bb3a8d72
|
Bump github.com/databricks/databricks-sdk-go from 0.48.0 to 0.49.0 (#1843)
Bumps
[github.com/databricks/databricks-sdk-go](https://github.com/databricks/databricks-sdk-go)
from 0.48.0 to 0.49.0.
**Release notes** (sourced from [github.com/databricks/databricks-sdk-go's releases](https://github.com/databricks/databricks-sdk-go/releases); the PR body's changelog section repeats this list):

v0.49.0 — API Changes:

- Added `w.DisableLegacyDbfs` workspace-level service.
- Added `UnityCatalogProvisioningState` field for `catalog.OnlineTable`.
- Added `IsTruncated` field for `dashboards.Result`.
- Added `EffectiveBudgetPolicyId` field for `jobs.BaseJob` and `jobs.Job`.
- Added `BudgetPolicyId` field for `jobs.CreateJob`, `jobs.JobSettings` and `jobs.SubmitRun`.
- Added `Report` field for `pipelines.IngestionConfig`.
- Added `SequenceBy` field for `pipelines.TableSpecificConfig`.
- Added `NotifyOnOk` field for `sql.Alert`, `sql.CreateAlertRequestAlert`, `sql.ListAlertsResponseAlert` and `sql.UpdateAlertRequestAlert`.

OpenAPI SHA: cf9c61453990df0f9453670f2fe68e1b128647a2, Date: 2024-10-14
|
|
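For a sense of how the new fields in this release surface to Go SDK users, here is a minimal sketch that sets the `BudgetPolicyId` added to `jobs.CreateJob`. The workspace-client setup is the SDK's standard pattern; the job name, notebook path and policy ID are placeholders, not values from the commit.

```go
package main

import (
	"context"
	"fmt"
	"log"

	"github.com/databricks/databricks-sdk-go"
	"github.com/databricks/databricks-sdk-go/service/jobs"
)

func main() {
	ctx := context.Background()

	// Standard SDK workspace client; credentials come from the environment.
	w, err := databricks.NewWorkspaceClient()
	if err != nil {
		log.Fatal(err)
	}

	// BudgetPolicyId on jobs.CreateJob is one of the fields added in 0.49.0.
	// The name, notebook path and policy ID below are placeholders.
	created, err := w.Jobs.Create(ctx, jobs.CreateJob{
		Name:           "nightly-report",
		BudgetPolicyId: "0123-456789-abcdef",
		Tasks: []jobs.Task{{
			TaskKey:      "main",
			NotebookTask: &jobs.NotebookTask{NotebookPath: "/Workspace/Users/me/report"},
		}},
	})
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println("created job", created.JobId)
}
```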
Andrew Nester |
f0e2981596
|
Added JSON input validation for CLI commands (#1771)
## Changes
Added JSON input validation for CLI commands. Now, when invalid JSON is passed as a payload to a CLI command, the CLI normalises the input and reports any mismatches it detects, such as incorrect types or unknown fields. This diagnostic information is printed to standard error output and does not block command execution, so the change is backward compatible. Fixes #1769 #1764 #1625 #1560
## Tests
Added unit tests.
```
andrew.nester@HFW9Y94129 ~ % databricks jobs create --json '{"seeti}'
Error: error decoding JSON at (inline):1:2: unexpected EOF

andrew.nester@HFW9Y94129 ~ % databricks jobs create --json '{"seeti": true}'
Warning: unknown field: seeti in (inline):1:9
Error: Job settings must be specified.
```
--------- Co-authored-by: Pieter Noordhuis <pieter.noordhuis@databricks.com> |
|
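The CLI's validation is built on its own normalisation layer; as a rough standalone illustration of the same idea (warn about unknown fields on standard error without blocking execution), here is a sketch using only `encoding/json`. The `jobSettings` struct is a stand-in, not the CLI's actual type.

```go
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"os"
)

// jobSettings is a stand-in payload type, not the CLI's real request struct.
type jobSettings struct {
	Name string `json:"name"`
}

// warnOnMismatch re-decodes the payload strictly and reports problems (such
// as unknown fields) on stderr without failing the command.
func warnOnMismatch(raw []byte) {
	dec := json.NewDecoder(bytes.NewReader(raw))
	dec.DisallowUnknownFields()
	var strict jobSettings
	if err := dec.Decode(&strict); err != nil {
		fmt.Fprintf(os.Stderr, "Warning: %v\n", err)
	}
}

func main() {
	raw := []byte(`{"seeti": true}`)

	// The lenient decode drives the command itself...
	var settings jobSettings
	if err := json.Unmarshal(raw, &settings); err != nil {
		fmt.Fprintf(os.Stderr, "Error: %v\n", err)
		os.Exit(1)
	}
	// ...while the strict pass only warns, so execution is not blocked.
	warnOnMismatch(raw)

	fmt.Println("job name:", settings.Name)
}
```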
Andrew Nester |
54799a1918
|
Upgrade Go SDK to 0.44.0 (#1679)
## Changes
Upgrade Go SDK to 0.44.0.
--------- Co-authored-by: Pieter Noordhuis <pieter.noordhuis@databricks.com> |
|
dependabot[bot] |
8468878eed
|
Bump github.com/databricks/databricks-sdk-go from 0.42.0 to 0.43.0 (#1522)
Bumps [github.com/databricks/databricks-sdk-go](https://github.com/databricks/databricks-sdk-go) from 0.42.0 to 0.43.0.

**Release notes** (sourced from [github.com/databricks/databricks-sdk-go's releases](https://github.com/databricks/databricks-sdk-go/releases); the PR body's changelog section repeats this list):

v0.43.0 — Major Changes and Improvements:

- Support partners in user agent for SDK (#925).
- Add `serverless_compute_id` field to the config (#952).

Other Changes:

- Generate from latest spec (#944) and (#947).

API Changes:

- Changed `IsolationMode` field for `catalog.CatalogInfo` and `catalog.UpdateCatalog` to `catalog.CatalogIsolationMode`, and added `catalog.CatalogIsolationMode`.
- Added `IsolationMode` field for `catalog.ExternalLocationInfo`, `catalog.StorageCredentialInfo`, `catalog.UpdateExternalLocation` and `catalog.UpdateStorageCredential`.
- Added `MaxResults` and `PageToken` fields for `catalog.ListCatalogsRequest`, and `NextPageToken` field for `catalog.ListCatalogsResponse`.
- Added `TableServingUrl` field for `catalog.OnlineTable`.
- Added `CreateSchedule`, `CreateSubscription`, `DeleteSchedule`, `DeleteSubscription`, `GetSchedule`, `GetSubscription`, `List`, `ListSchedules`, `ListSubscriptions` and `UpdateSchedule` methods for the `w.Lakeview` workspace-level service.
- Added the `dashboards.CreateScheduleRequest`, `dashboards.CreateSubscriptionRequest`, `dashboards.CronSchedule`, `dashboards.DashboardView`, `dashboards.DeleteScheduleRequest`, `dashboards.DeleteSubscriptionRequest`, `dashboards.GetScheduleRequest`, `dashboards.GetSubscriptionRequest`, `dashboards.ListDashboardsRequest`, `dashboards.ListDashboardsResponse`, `dashboards.ListSchedulesRequest`, `dashboards.ListSchedulesResponse`, `dashboards.ListSubscriptionsRequest`, `dashboards.ListSubscriptionsResponse`, `dashboards.Schedule`, `dashboards.SchedulePauseStatus`, `dashboards.Subscriber`, `dashboards.Subscription`, `dashboards.SubscriptionSubscriberDestination`, `dashboards.SubscriptionSubscriberUser` and `dashboards.UpdateScheduleRequest` structs.
- Added `OnStreamingBacklogExceeded` field for `jobs.JobEmailNotifications`, `jobs.TaskEmailNotifications` and `jobs.WebhookNotifications`.
- Added `EnvironmentKey` field for `jobs.RunTask`.
- Removed `ConditionTask`, `DbtTask`, `NotebookTask`, `PipelineTask`, `PythonWheelTask`, `RunJobTask`, `SparkJarTask`, `SparkPythonTask`, `SparkSubmitTask`, `SqlTask` and `Environments` fields for `jobs.SubmitRun`.
- Added `DbtTask` and `EnvironmentKey` fields for `jobs.SubmitTask`.
- Added `Periodic` field for `jobs.TriggerSettings`, plus `jobs.PeriodicTriggerConfiguration` and `jobs.PeriodicTriggerConfigurationTimeUnit`.
- Added `ProviderSummary` field for `marketplace.Listing`, plus `marketplace.ProviderIconFile`, `marketplace.ProviderIconType` and `marketplace.ProviderListingSummaryInfo`.
- Added `Start` method for the `w.Apps` workspace-level service and the new `w.ServingEndpointsDataPlane` workspace-level service.
- Added `ServicePrincipalId` and `ServicePrincipalName` fields for `serving.App`, and `serving.StartAppRequest`.
- Added `QueryNextPage` method for the `w.VectorSearchIndexes` workspace-level service, `QueryType` field for `vectorsearch.QueryVectorIndexRequest`, `NextPageToken` field for `vectorsearch.QueryVectorIndexResponse`, and `vectorsearch.QueryVectorIndexNextPageRequest`.

OpenAPI SHA: 7437dabb9dadee402c1fc060df4c1ce8cc5369f0, Date: 2024-06-25 |
|
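The removal of per-task fields from `jobs.SubmitRun` is the main breaking change for callers. Below is a minimal sketch of the post-0.43.0 request shape, built only from the struct fields named above; the run name, task key, environment key and notebook path are placeholders.

```go
package main

import (
	"encoding/json"
	"fmt"

	"github.com/databricks/databricks-sdk-go/service/jobs"
)

func main() {
	// After 0.43.0 the task payload lives on each jobs.SubmitTask rather than
	// on jobs.SubmitRun itself; EnvironmentKey is one of the new fields.
	req := jobs.SubmitRun{
		RunName: "one-off-run",
		Tasks: []jobs.SubmitTask{{
			TaskKey:        "main",
			EnvironmentKey: "default",
			NotebookTask:   &jobs.NotebookTask{NotebookPath: "/Workspace/Users/me/etl"},
		}},
	}

	out, _ := json.MarshalIndent(req, "", "  ")
	fmt.Println(string(out))
}
```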
dependabot[bot] |
781688c9cb
|
Bump github.com/databricks/databricks-sdk-go from 0.38.0 to 0.39.0 (#1405)
Bumps [github.com/databricks/databricks-sdk-go](https://github.com/databricks/databricks-sdk-go) from 0.38.0 to 0.39.0.

**Release notes** (sourced from [github.com/databricks/databricks-sdk-go's releases](https://github.com/databricks/databricks-sdk-go/releases); the PR body's changelog section repeats this list):

v0.39.0:

- Ignored flaky integration tests (#894).
- Added retries for "worker env WorkerEnvId(workerenv-XXXXX) not found" (#890).
- Updated SDK to OpenAPI spec (#899).

Note: This release contains breaking changes, please see the API changes below for more details.

API Changes:

- Added `IngestionDefinition` and `Deployment` fields for `pipelines.CreatePipeline`, `pipelines.EditPipeline` and `pipelines.PipelineSpec`.
- Added `compute.ClusterStatus`, `compute.ClusterStatusResponse` and `compute.LibraryInstallStatus`.
- Added `WarehouseId` field for `jobs.NotebookTask`.
- Added `RunAs` field for `jobs.SubmitRun`.
- Added `pipelines.DeploymentKind`, `pipelines.IngestionConfig`, `pipelines.ManagedIngestionPipelineDefinition`, `pipelines.PipelineDeployment`, `pipelines.SchemaSpec` and `pipelines.TableSpec`.
- Added `GetOpenApi` method for the `w.ServingEndpoints` workspace-level service, and `serving.GetOpenApiRequest`.
- Added `SchemaId` field for `catalog.SchemaInfo`.
- Added `Operation` field for `catalog.ValidationResult`, and `catalog.ValidationResultOperation`.
- Added `Requirements` field for `compute.Library`.
- Removed `AwsOperation`, `AzureOperation` and `GcpOperation` fields for `catalog.ValidationResult`, along with `catalog.ValidationResultAwsOperation`, `catalog.ValidationResultAzureOperation` and `catalog.ValidationResultGcpOperation`.
- Removed `compute.ClusterStatusRequest` and `compute.LibraryFullStatusStatus`.
- Changed `ClusterStatus` method for the `w.Libraries` workspace-level service: the new request type is `compute.ClusterStatus` and it now returns `compute.ClusterStatusResponse`.
- Changed `Status` field for `compute.LibraryFullStatus` to `compute.LibraryInstallStatus`.

OpenAPI SHA: 21f9f1482f9d0d15228da59f2cd9f0863d2a6d55, Date: 2024-04-23 |
|
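A sketch of what the renamed `ClusterStatus` request/response types might look like in use. The `ClusterId` and `LibraryStatuses` field names are assumed to carry over from the pre-0.39.0 types, and the cluster ID is a placeholder.

```go
package main

import (
	"context"
	"fmt"
	"log"

	"github.com/databricks/databricks-sdk-go"
	"github.com/databricks/databricks-sdk-go/service/compute"
)

func main() {
	ctx := context.Background()
	w, err := databricks.NewWorkspaceClient()
	if err != nil {
		log.Fatal(err)
	}

	// Post-0.39.0 the request type is compute.ClusterStatus and the call
	// returns compute.ClusterStatusResponse; field names are assumed to match
	// the older request/response shapes.
	res, err := w.Libraries.ClusterStatus(ctx, compute.ClusterStatus{
		ClusterId: "0123-456789-abcdefgh",
	})
	if err != nil {
		log.Fatal(err)
	}
	for _, st := range res.LibraryStatuses {
		fmt.Println(st.Status)
	}
}
```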
dependabot[bot] |
f28a9d7107
|
Bump github.com/databricks/databricks-sdk-go from 0.36.0 to 0.37.0 (#1326)
[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=github.com/databricks/databricks-sdk-go&package-manager=go_modules&previous-version=0.36.0&new-version=0.37.0)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`.

---

**Dependabot commands and options**

You can trigger Dependabot actions by commenting on this PR:

- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
- `@dependabot show <dependency name> ignore conditions` will show all of the ignore conditions of the specified dependency
- `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
- `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
- `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)

--------- Signed-off-by: dependabot[bot] <support@github.com> Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com> Co-authored-by: Andrew Nester <andrew.nester@databricks.com> |
|
Andrew Nester |
c7818560ca
|
Add usage string when command fails with incorrect arguments (#1276)
## Changes
Add usage string when command fails with incorrect arguments. Fixes #1119
## Tests
Example output:
```
> databricks libraries cluster-status
Error: accepts 1 arg(s), received 0
Usage:
  databricks libraries cluster-status CLUSTER_ID [flags]

Flags:
  -h, --help   help for cluster-status

Global Flags:
      --debug            enable debug logging
  -o, --output type      output type: text or json (default text)
  -p, --profile string   ~/.databrickscfg profile
  -t, --target string    bundle target to use (if applicable)
```
|
|
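The CLI's commands are generated on top of cobra, and cobra's built-in argument validators produce both the `accepts 1 arg(s), received 0` error and the usage block shown above. A standalone sketch of that behaviour (not the CLI's generated code):

```go
package main

import (
	"fmt"
	"os"

	"github.com/spf13/cobra"
)

func main() {
	cmd := &cobra.Command{
		Use:   "cluster-status CLUSTER_ID",
		Short: "Get library status on a cluster",
		// cobra's ExactArgs produces "accepts 1 arg(s), received 0" when the
		// positional argument is missing.
		Args: cobra.ExactArgs(1),
		RunE: func(cmd *cobra.Command, args []string) error {
			fmt.Println("cluster:", args[0])
			return nil
		},
	}
	// With SilenceUsage left false, cobra prints the usage block after an
	// argument-count error, similar to the output quoted above.
	if err := cmd.Execute(); err != nil {
		os.Exit(1)
	}
}
```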
Pieter Noordhuis |
74b1e05ed7
|
Update Go SDK to v0.34.0 (#1256)
## Changes
SDK release: https://github.com/databricks/databricks-sdk-go/releases/tag/v0.34.0
This incorporates two changes to the generation code:
* Use explicit empty check for response types (see https://github.com/databricks/databricks-sdk-go/pull/831)
* Support subservices for the settings commands (see https://github.com/databricks/databricks-sdk-go/pull/826)
As part of the subservices support, this change also updates how methods are registered with their services. This used to be done with `init` functions and is now done through inline function calls, which should have a (negligible) positive impact on binary start time because we no longer have to call as many `init` functions.
## Tests
tbd |
|
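The note about `init` functions refers to how generated services register their methods. Here is a toy sketch (not the generator's actual output) contrasting the two registration styles; the registry type and method names are made up.

```go
package main

import "fmt"

// registry is a toy stand-in for a service's method table; the real SDK and
// CLI types are different.
type registry struct{ methods []string }

func (r *registry) register(name string) { r.methods = append(r.methods, name) }

// Old style: package init functions run at program start for every imported
// package, whether or not the service is ever used.
var global = &registry{}

func init() { global.register("jobs/list") }

// New style: registration happens inline, when the service is constructed.
func newJobsService() *registry {
	r := &registry{}
	r.register("jobs/list")
	r.register("jobs/create")
	return r
}

func main() {
	fmt.Println("init-registered:", global.methods)
	fmt.Println("inline-registered:", newJobsService().methods)
}
```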
Miles Yucht |
b65ce75c1f
|
Use Go SDK Iterators when listing resources with the CLI (#1202)
## Changes
Currently, when the CLI runs a list API call (like list jobs), it uses the `List*All` methods from the SDK, which list all resources in the collection. This is very slow for large collections: if you need to list all jobs from a workspace that has 10,000+ jobs, you'll be waiting for at least 100 RPCs to complete before seeing any output.

Instead of using `List*All()` methods, the SDK recently added an iterator data structure that allows traversing the collection without needing to completely list it first. New pages are fetched lazily if the next requested item belongs to the next page. Using the `List()` methods that return these iterators, the CLI can proactively print out some of the response before the complete collection has been fetched.

This involves a pretty major rewrite of the rendering logic in `cmdio`. The idea there is to define custom rendering logic based on the type of the provided resource. There are three renderer interfaces:
1. textRenderer: supports printing something in a textual format (i.e. not JSON, and not templated).
2. jsonRenderer: supports printing something in a pretty-printed JSON format.
3. templateRenderer: supports printing something using a text template.

There are also three renderer implementations:
1. readerRenderer: supports printing a reader. This only implements the textRenderer interface.
2. iteratorRenderer: supports printing a `listing.Iterator` from the Go SDK. This implements jsonRenderer and templateRenderer, buffering 20 resources at a time before writing them to the output.
3. defaultRenderer: supports printing arbitrary resources (the previous implementation).

Callers will either use `cmdio.Render()` for rendering individual resources or an `io.Reader`, or `cmdio.RenderIterator()` for rendering an iterator. This separate method is needed to safely be able to match on the type of the iterator, since Go does not allow runtime type matches on generic types with an existential type parameter.

One other change that needs to happen is to split the templates used for text representation of list resources into a header template and a row template. The template is now executed multiple times for List API calls, but the header should only be printed once. To support this, I have added `headerTemplate` to `cmdIO`, and I have also changed `RenderWithTemplate` to include a `headerTemplate` parameter everywhere.

## Tests
- [x] Unit tests for text rendering logic
- [x] Unit test for reflection-based iterator construction.
--------- Co-authored-by: Andrew Nester <andrew.nester@databricks.com> |
|
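A minimal sketch of consuming one of the SDK's `listing.Iterator` values as described above; the jobs service is used only as an example and the client setup is the SDK's standard pattern.

```go
package main

import (
	"context"
	"fmt"
	"log"

	"github.com/databricks/databricks-sdk-go"
	"github.com/databricks/databricks-sdk-go/service/jobs"
)

func main() {
	ctx := context.Background()
	w, err := databricks.NewWorkspaceClient()
	if err != nil {
		log.Fatal(err)
	}

	// List() returns a lazy iterator: new pages are fetched only as items are
	// consumed, so output can start before the whole collection is listed.
	it := w.Jobs.List(ctx, jobs.ListJobsRequest{})
	for it.HasNext(ctx) {
		job, err := it.Next(ctx)
		if err != nil {
			log.Fatal(err)
		}
		if job.Settings != nil {
			fmt.Println(job.JobId, job.Settings.Name)
		}
	}
}
```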
Andrew Nester |
ef67b1755e
|
Do not require positional arguments if they should be provided in JSON (#1125)
## Changes
Do not require positional arguments if they should be provided in JSON.
Fixes #1122 |
|
Pieter Noordhuis |
3c76a11d00
|
Upgrade Go SDK to v0.29.0 (#1111)
## Changes
See:
* https://github.com/databricks/databricks-sdk-go/releases/tag/v0.29.0
* https://github.com/databricks/databricks-sdk-go/releases/tag/v0.28.0
## Tests
Unit and integration tests pass. |
|
shreyas-goenka |
a6752a5388
|
Add list of supported values for flags that represent an enum field (#1036)
## Changes
This PR adds the list of supported values for flags that represent an enum field to the flag's documentation. |
|
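A rough sketch of the general technique (not the CLI's generated flag code): embedding the supported enum values in a cobra flag's usage string so they show up in `--help`. The command name, flag and values below are made up.

```go
package main

import (
	"fmt"
	"strings"

	"github.com/spf13/cobra"
)

func main() {
	// Hypothetical enum values; the real CLI derives them from the OpenAPI
	// spec when generating each command's flags.
	supported := []string{"CAN_MANAGE", "CAN_RUN", "CAN_VIEW"}

	var level string
	cmd := &cobra.Command{
		Use: "set-permission",
		RunE: func(cmd *cobra.Command, args []string) error {
			fmt.Println("permission level:", level)
			return nil
		},
	}
	// Embedding the allowed values in the usage string is what makes them
	// show up in --help output.
	cmd.Flags().StringVar(&level, "permission-level", "",
		fmt.Sprintf("Permission level. Supported values: [%s]", strings.Join(supported, ", ")))

	_ = cmd.Execute()
}
```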
shreyas-goenka |
76840176e3
|
Add documentation for positional args in commands generated from the Databricks OpenAPI specification (#1033)
## Changes
This PR adds documentation for positional arguments in commands that are generated from the OpenAPI spec.
Note: the changes to `.gitattributes` will be reverted / properly fixed in https://github.com/databricks/cli/pull/1012 |
|
dependabot[bot] |
c3ced68c60
|
Bump github.com/databricks/databricks-sdk-go from 0.24.0 to 0.25.0 (#980)
Bumps [github.com/databricks/databricks-sdk-go](https://github.com/databricks/databricks-sdk-go) from 0.24.0 to 0.25.0.

**Changelog** (sourced from [github.com/databricks/databricks-sdk-go's changelog](https://github.com/databricks/databricks-sdk-go/blob/main/CHANGELOG.md)):

0.25.0:

- Make sure path parameters are first in order in RequiredFields (#669).
- Added Field.IsRequestBodyField method for code generation (#670).
- Added regressions question to the issue template (#676).
- Added telemetry for CI/CD platform to useragent (#665).
- Skipped GCP Integration Tests using Statement Execution API (#678).
- Added more detailed error message on default credentials not found error (#679).
- Updated SDK to latest OpenAPI Spec (#685).

API Changes:

- Changed `Create` method for the `w.Functions` and `w.Metastores` workspace-level services with new required argument order.
- Changed `InputParams` and `ReturnParams` fields for `catalog.CreateFunction` and `catalog.FunctionInfo` to `catalog.FunctionParameterInfos`.
- Changed `Properties` field for `catalog.CreateFunction` and `catalog.FunctionInfo` to `string`.
- Changed `StorageRoot` field for `catalog.CreateMetastore` to no longer be required.
- Added `SkipValidation` field for `catalog.UpdateExternalLocation`.
- Added `Libraries` field for `compute.CreatePolicy`, `compute.EditPolicy` and `compute.Policy`.
- Added `compute.InitScriptEventDetails`, plus `InitScripts` field for `compute.EventDetails` and `File` field for `compute.InitScriptInfo`.
- Added `ZoneId` field for `compute.InstancePoolGcpAttributes`.
- Added `IncludeResolvedValues` field for `jobs.GetRunRequest`.
- Added `EditMode` field for `jobs.CreateJob` and `jobs.JobSettings`, plus `jobs.CreateJobEditMode` and `jobs.JobSettingsEditMode`.
- Added `NetworkConnectivityConfigId` field for `provisioning.UpdateWorkspaceRequest`.
- Added `ContainerLogs` and `ExtraInfo` fields for `serving.DeploymentStatus`.
- Added `catalog.CreateFunctionRequest`, `catalog.DependencyList` and `catalog.FunctionParameterInfos`.
- Added `compute.InitScriptExecutionDetails`, `compute.InitScriptExecutionDetailsStatus`, `compute.InitScriptInfoAndExecutionDetails` and `compute.LocalFileInfo`.
- Added `DeleteApp`, `GetApp`, `GetAppDeploymentStatus`, `GetApps` and `GetEvents` methods for the `w.Apps` workspace-level service; removed its `Delete` and `Get` methods.
- Added `serving.AppEvents`, `serving.AppServiceStatus`, `serving.DeleteAppResponse`, `serving.GetAppDeploymentStatusRequest`, `serving.GetAppResponse`, `serving.GetEventsRequest`, `serving.ListAppEventsResponse` and `serving.ListAppsResponse`.
- Added the `a.NetworkConnectivity` account-level service, along with `settings.CreateNetworkConnectivityConfigRequest`, `settings.CreatePrivateEndpointRuleRequest`, `settings.CreatePrivateEndpointRuleRequestGroupId`, `settings.DeleteNetworkConnectivityConfigurationRequest`, `settings.DeletePrivateEndpointRuleRequest`, `settings.GetNetworkConnectivityConfigurationRequest`, `settings.GetPrivateEndpointRuleRequest`, `settings.NccAzurePrivateEndpointRule`, `settings.NccAzurePrivateEndpointRuleConnectionState`, `settings.NccAzurePrivateEndpointRuleGroupId`, `settings.NccAzureServiceEndpointRule`, `settings.NccEgressConfig`, `settings.NccEgressDefaultRules`, `settings.NccEgressTargetRules` and `settings.NetworkConnectivityConfiguration`.
- Removed `jobs.JobSettingsUiState` and `jobs.CreateJobUiState`, and the `UiState` field for `jobs.CreateJob` and `jobs.JobSettings`.
- Removed the `a.OAuthEnrollment` account-level service, `oauth2.CreateOAuthEnrollment` and `oauth2.OAuthEnrollmentStatus`.

OpenAPI SHA: e7b127cb07af8dd4d8c61c7cc045c8910cdbb02a, Date: 2023-11-08

Dependency updates:

- Bump google.golang.org/api from 0.146.0 to 0.150.0 (#683).
- Bump golang.org/x/mod from 0.13.0 to 0.14.0 (#681).
- Bump google.golang.org/grpc from 1.58.2 to 1.58.3 in /examples/slog (#672).
- Bump google.golang.org/grpc to 1.58.3 in /examples/zerolog (#684).
- Bump golang.org/x/time from 0.3.0 to 0.4.0 (#680).
|
|
Andrew Nester |
d768994bbf
|
Simplified code generation logic for handling path and request body parameters and JSON input (#905)
## Changes Simplified code generation logic for handling path and request body parameters and JSON input Note: relies on these PRs: https://github.com/databricks/databricks-sdk-go/pull/666 https://github.com/databricks/databricks-sdk-go/pull/669 https://github.com/databricks/databricks-sdk-go/pull/670 |
|
hectorcast-db |
36f30c8b47
|
Update Go SDK to 0.23.0 and use custom marshaller (#772)
## Changes Update Go SDK to 0.23.0 and use custom marshaller. ## Tests * Run unit tests * Run nightly * Manual test: ``` ./cli jobs create --json @myjob.json ``` with ``` { "name": "my-job-marshal-test-go", "tasks": [{ "task_key": "testgomarshaltask", "new_cluster": { "num_workers": 0, "spark_version": "10.4.x-scala2.12", "node_type_id": "Standard_DS3_v2" }, "libraries": [ { "jar": "dbfs:/max/jars/exampleJarTask.jar" } ], "spark_jar_task": { "main_class_name": "com.databricks.quickstart.exampleTask" } }] } ``` Main branch: ``` Error: Cluster validation error: Missing required field: settings.cluster_spec.new_cluster.size ``` This branch: ``` { "job_id":<jobid> } ``` --------- Co-authored-by: Miles Yucht <miles@databricks.com> |
|
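The `num_workers: 0` case in the commit above is why the custom marshaller matters: with a plain `omitempty` JSON tag, a zero value is dropped from the payload, which is what produced the cluster validation error on the main branch. Below is a minimal, self-contained Go sketch of the ForceSendFields idea; it is an illustration of the concept only, not the SDK's actual marshalling code, and the `NewCluster` struct and field handling here are simplified assumptions.

```go
// Illustrative sketch (not the SDK's real implementation): why a plain
// `omitempty` tag drops num_workers=0, and how a ForceSendFields-style
// custom marshaller can keep it in the payload.
package main

import (
	"encoding/json"
	"fmt"
)

type NewCluster struct {
	NumWorkers   int    `json:"num_workers,omitempty"`
	SparkVersion string `json:"spark_version,omitempty"`
	NodeTypeID   string `json:"node_type_id,omitempty"`
	// Hypothetical field mirroring the ForceSendFields convention.
	ForceSendFields []string `json:"-"`
}

// MarshalJSON re-adds zero-valued fields listed in ForceSendFields.
// For brevity this sketch only handles NumWorkers.
func (c NewCluster) MarshalJSON() ([]byte, error) {
	type plain NewCluster // avoid recursing into this method
	b, err := json.Marshal(plain(c))
	if err != nil {
		return nil, err
	}
	var m map[string]any
	if err := json.Unmarshal(b, &m); err != nil {
		return nil, err
	}
	for _, f := range c.ForceSendFields {
		if f == "NumWorkers" {
			m["num_workers"] = c.NumWorkers
		}
	}
	return json.Marshal(m)
}

func main() {
	c := NewCluster{
		NumWorkers:      0,
		SparkVersion:    "10.4.x-scala2.12",
		NodeTypeID:      "Standard_DS3_v2",
		ForceSendFields: []string{"NumWorkers"},
	}
	out, _ := json.Marshal(c)
	fmt.Println(string(out)) // num_workers:0 is kept in the payload
}
```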
Andrew Nester |
e1d1e95525
|
Updated Go SDK to 0.22.0 (#831)
## Changes Updated Go SDK to 0.22.0 |
|
Pieter Noordhuis |
1752e29885
|
Update Go SDK to v0.19.0 (#729)
## Changes * Update Go SDK to v0.19.0 * Update commands per OpenAPI spec from Go SDK * Incorporate `client.Do()` signature change to include a (nil) header map * Update `workspace.WorkspaceService` mock with permissions methods * Skip `files` service in codegen; already implemented under the `fs` command ## Tests Unit and integration tests pass. |
|
Miles Yucht |
3697dfcb51
|
Release v0.202.0 (#619)
Breaking Change: * Require include glob patterns to be explicitly defined ([#602](https://github.com/databricks/cli/pull/602)). Bundles: * Add support for more SDK config options ([#587](https://github.com/databricks/cli/pull/587)). * Add template renderer for Databricks templates ([#589](https://github.com/databricks/cli/pull/589)). * Fix formatting in renderer.go ([#593](https://github.com/databricks/cli/pull/593)). * Fixed python wheel test ([#608](https://github.com/databricks/cli/pull/608)). * Auto detect Python wheel packages and infer build command ([#603](https://github.com/databricks/cli/pull/603)). * Added support for artifacts building for bundles ([#583](https://github.com/databricks/cli/pull/583)). * Add support for cloning repositories ([#544](https://github.com/databricks/cli/pull/544)). * Add regexp compile helper function for templates ([#601](https://github.com/databricks/cli/pull/601)). * Add unit test that raw strings are printed as is ([#599](https://github.com/databricks/cli/pull/599)). Internal: * Fix tests under ./cmd/configure if DATABRICKS_TOKEN is set ([#605](https://github.com/databricks/cli/pull/605)). * Remove dependency on global state in generated commands ([#595](https://github.com/databricks/cli/pull/595)). * Remove dependency on global state for the root command ([#606](https://github.com/databricks/cli/pull/606)). * Add merge_group trigger for build ([#612](https://github.com/databricks/cli/pull/612)). * Added support for build command chaining and error on missing wheel ([#607](https://github.com/databricks/cli/pull/607)). * Add TestAcc prefix to filer test and fix any failing tests ([#611](https://github.com/databricks/cli/pull/611)). * Add url parse helper function for templates ([#600](https://github.com/databricks/cli/pull/600)). * Remove dependency on global state for remaining commands ([#613](https://github.com/databricks/cli/pull/613)). * Update CHANGELOG template ([#588](https://github.com/databricks/cli/pull/588)). |
|
Pieter Noordhuis |
3fa400f00f
|
Remove dependency on global state in generated commands (#595)
## Changes Generated commands relied on global variables for flags and request payloads. This is difficult to test if a sequence of tests tries to run the same command with various arguments because the global state causes test interference. Moreover, it is impossible to run tests in parallel. This change modifies the approach and turns every command group and command itself into a function that returns a `*cobra.Command`. All flags and request payloads are variables scoped to the command's initialization function. This means it is possible to construct independent copies of the CLI structure and fixes the test isolation issue. The scope of this change is only the generated commands. The other commands will be changed accordingly in subsequent changes. ## Tests Unit and integration tests pass. |
|
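A minimal sketch of the pattern described in the commit above, assuming the usual spf13/cobra API: the command is produced by a constructor function, so its flags and request payload live in that function's scope instead of in package-level globals. The names (`newCreateCommand`, `createRequest`) are illustrative, not the CLI's actual generated identifiers.

```go
// Each generated command is built by a constructor that returns a
// *cobra.Command; two copies of the command do not share state, so tests
// can construct independent CLI trees and run in parallel.
package example

import (
	"fmt"

	"github.com/spf13/cobra"
)

type createRequest struct {
	Name string
}

func newCreateCommand() *cobra.Command {
	// Request payload and flags are scoped to this function.
	var createReq createRequest
	var jsonInput string

	cmd := &cobra.Command{
		Use:   "create",
		Short: "Create a resource",
		RunE: func(cmd *cobra.Command, args []string) error {
			fmt.Fprintf(cmd.OutOrStdout(), "creating %q (json=%q)\n", createReq.Name, jsonInput)
			return nil
		},
	}
	cmd.Flags().StringVar(&createReq.Name, "name", "", "resource name")
	cmd.Flags().StringVar(&jsonInput, "json", "", "inline JSON or @path/to/file.json")
	return cmd
}
```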
Serge Smertin |
acf292da37
|
Release v0.201.0 (#586)
* Add development runs ([#522](https://github.com/databricks/cli/pull/522)). * Support tab completion for profiles ([#572](https://github.com/databricks/cli/pull/572)). * Correctly use --profile flag passed for all bundle commands ([#571](https://github.com/databricks/cli/pull/571)). * Disallow notebooks in paths where files are expected ([#573](https://github.com/databricks/cli/pull/573)). * Improve auth login experience ([#570](https://github.com/databricks/cli/pull/570)). * Remove base path checks during sync ([#576](https://github.com/databricks/cli/pull/576)). * First look for databricks.yml before falling back to bundle.yml ([#580](https://github.com/databricks/cli/pull/580)). * Integrate with auto-release infra ([#581](https://github.com/databricks/cli/pull/581)). API Changes: * Removed `databricks metastores maintenance` command. * Added `databricks metastores enable-optimization` command. * Added `databricks tables update` command. * Changed `databricks account settings delete-personal-compute-setting` command with new required argument order. * Changed `databricks account settings read-personal-compute-setting` command with new required argument order. * Added `databricks clean-rooms` command group. OpenAPI SHA: 850a075ed9758d21a6bc4409506b48c8b9f93ab4, Date: 2023-07-18 Dependency updates: * Bump golang.org/x/term from 0.9.0 to 0.10.0 ([#567](https://github.com/databricks/cli/pull/567)). * Bump golang.org/x/oauth2 from 0.9.0 to 0.10.0 ([#566](https://github.com/databricks/cli/pull/566)). * Bump golang.org/x/mod from 0.11.0 to 0.12.0 ([#568](https://github.com/databricks/cli/pull/568)). * Bump github.com/databricks/databricks-sdk-go from 0.12.0 to 0.13.0 ([#585](https://github.com/databricks/cli/pull/585)). |
|
Andrew Nester |
17ed7a317b
|
Fixed ignoring required positional parameters when --json flag is provided (#535)
## Changes When a command has required positional parameters that can't be unmarshalled from JSON, we should still require them even when the `--json` flag is provided. The reason is that for some commands, for example `databricks groups patch ID`, these arguments are actually path arguments in the API and can't be set as part of the `--json` body. The original change which introduced this ignore logic is here: https://github.com/databricks/cli/pull/405 Fixes https://github.com/databricks/cli/issues/533, https://github.com/databricks/cli/issues/537 Note: Code generation is based on the change in this PR: https://github.com/databricks/databricks-sdk-go/pull/536 ## Tests 1. Running `cli groups patch 123 --json {...}` works correctly. Backward compatibility tests with previous changes from https://github.com/databricks/cli/pull/405: 1. `cli clusters events --json '{"cluster_id": "1029-xxxx"}'` - works, returns list of events 2. `cli clusters events 1029-xxxx` - works, returns list of events 3. `cli clusters events` - works, first prompts for Cluster ID and then returns the list of events |
|
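The behavior described in the commit above can be sketched with plain cobra: the positional path argument remains mandatory because it goes into the request URL, while `--json` only covers the request body. This is an illustrative sketch under those assumptions, not the generated command's real code.

```go
// The group ID stays a required positional argument even when --json
// supplies the request body, because the ID is a URL path parameter.
package example

import (
	"fmt"

	"github.com/spf13/cobra"
)

func newPatchCommand() *cobra.Command {
	var jsonBody string
	cmd := &cobra.Command{
		Use:  "patch ID",
		Args: cobra.ExactArgs(1), // ID is always required, --json or not
		RunE: func(cmd *cobra.Command, args []string) error {
			id := args[0]
			fmt.Fprintf(cmd.OutOrStdout(), "PATCH /groups/%s body=%s\n", id, jsonBody)
			return nil
		},
	}
	cmd.Flags().StringVar(&jsonBody, "json", "", "inline JSON or @path/to/file.json")
	return cmd
}
```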
Pieter Noordhuis |
ad8183d7a9
|
Bump Go SDK to v0.12.0 (#540)
## Changes * Regenerate CLI commands * Ignore `account-access-control-proxy` (see #505) ## Tests Unit and integration tests pass. |
|
Andrew Nester |
d11128c638
|
Fixed jobs create command to only accept JSON payload (#498)
## Changes Fixed jobs create command to only accept JSON payload. Note: relies on this PR from Go SDK https://github.com/databricks/databricks-sdk-go/pull/522 ## Tests ``` andrew.nester@HFW9Y94129 cli % ./cli jobs create -h Create a new job. Create a new job. Usage: databricks jobs create [flags] Flags: -h, --help help for create --json JSON either inline JSON string or @path/to/file.json with request body (default JSON (0 bytes)) Global Flags: -e, --environment string bundle environment to use (if applicable) --log-file file file to write logs to (default stderr) --log-format type log output format (text or json) (default text) --log-level format log level (default disabled) -o, --output type output type: text or json (default text) -p, --profile string ~/.databrickscfg profile --progress-format format format for progress logs (append, inplace, json) (default default) ``` |
|
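The help output above shows the `--json` convention of accepting either an inline JSON string or `@path/to/file.json`. A small hedged sketch of how such a flag value could be read follows; `readJSONFlag` is a hypothetical helper written for illustration, not the CLI's actual flag implementation.

```go
// Reads a --json flag value that is either inline JSON or an @file reference.
package example

import (
	"encoding/json"
	"fmt"
	"os"
	"strings"
)

// readJSONFlag unmarshals the flag value into dst, loading the bytes from
// a file when the value starts with '@'.
func readJSONFlag(value string, dst any) error {
	raw := []byte(value)
	if strings.HasPrefix(value, "@") {
		b, err := os.ReadFile(strings.TrimPrefix(value, "@"))
		if err != nil {
			return fmt.Errorf("read %s: %w", value, err)
		}
		raw = b
	}
	return json.Unmarshal(raw, dst)
}
```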
Andrew Nester |
9cf0e0db24
|
Correctly set ExactArgs if generated command has positional arguments (#488)
## Changes Some of the commands, such as `databricks alerts create`, require positional arguments which are not primitive. Since these arguments are required, we should correctly set ExactArgs for such commands. Fixes #367 ## Tests Running `databricks alerts create` Before ``` andrew.nester@HFW9Y94129 cli % ./cli alerts create panic: runtime error: index out of range [0] with length 0 goroutine 1 [running]: github.com/databricks/bricks/cmd/workspace/alerts.glob..func1(0x22a1280?, {0x2321638, 0x0, 0x0?}) github.com/databricks/bricks/cmd/workspace/alerts/alerts.go:57 +0x355 github.com/spf13/cobra.(*Command).execute(0x22a1280, {0x2321638, 0x0, 0x0}) github.com/spf13/cobra@v1.7.0/command.go:940 +0x862 github.com/spf13/cobra.(*Command).ExecuteC(0x22a0700) github.com/spf13/cobra@v1.7.0/command.go:1068 +0x3bd github.com/spf13/cobra.(*Command).ExecuteContextC(...) github.com/spf13/cobra@v1.7.0/command.go:1001 github.com/databricks/bricks/cmd/root.Execute() github.com/databricks/bricks/cmd/root/root.go:80 +0x6a main.main() github.com/databricks/bricks/main.go:18 +0x17 ``` After ``` andrew.nester@HFW9Y94129 cli % ./cli alerts create Error: provide command input in JSON format by specifying --json option ``` Acceptance test ``` === RUN TestAccAlertsCreateErrWhenNoArguments alerts_test.go:10: gcp helpers.go:147: Error running command: provide command input in JSON format by specifying --json option --- PASS: TestAccAlertsCreateErrWhenNoArguments (1.99s) PASS ``` |
|
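The before/after output above comes down to cobra argument validation: if `Run` indexes `args[0]` but the command never declares its arity, invoking it with no arguments panics with an index-out-of-range error, whereas declaring the arity turns that into a normal usage error before `Run` executes. A minimal sketch of that mechanism, with illustrative names rather than the generated code:

```go
package example

import "github.com/spf13/cobra"

func newAlertsCreateCommand() *cobra.Command {
	return &cobra.Command{
		Use: "create NAME",
		// Without an Args validator, cobra invokes Run even with zero
		// arguments, and args[0] below panics at runtime.
		Args: cobra.ExactArgs(1),
		Run: func(cmd *cobra.Command, args []string) {
			name := args[0] // safe: cobra has already verified exactly one argument
			_ = name
		},
	}
}
```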
Pieter Noordhuis |
dd92b3f989
|
Disable shell completions for generated commands (#483)
## Changes Disable shell completions for generated commands. Default completion behavior completes local files which never makes sense. Automatic contextual completion of required arguments would be super powerful but a lot of work to get right. Until then, we could do manual completion functions in `overrides.go` as needed. This fixes #374. ## Tests Confirmed manually that commands no longer complete local files. |
|
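cobra's generated completion scripts fall back to completing local file names when a command provides no `ValidArgsFunction`, which is what the commit above turns off. A minimal sketch using cobra's built-in `NoFileCompletions` helper; the function name `disableFileCompletion` is illustrative only.

```go
package example

import "github.com/spf13/cobra"

// disableFileCompletion switches off the default local-file completion.
// A contextual ValidArgsFunction can later be attached in overrides.go
// for commands where meaningful completion exists.
func disableFileCompletion(cmd *cobra.Command) {
	cmd.ValidArgsFunction = cobra.NoFileCompletions
}
```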
Pieter Noordhuis |
f219a0da5a
|
Annotate generated commands with OpenAPI package name (#466)
Co-authored-by: Serge Smertin <259697+nfx@users.noreply.github.com> |
|
Serge Smertin |
2aa61a7c1b
|
Update with the latest Go SDK (#457)
## Changes - removed deprecated methods - regenerated with the latest OpenAPI spec - picked up the latest Go SDK version ## Tests `make test` |
|
Andrew Nester |
1f130f3722
|
Do not use FgWhite and FgBlack for terminal output (#435)
## Changes Using white or black for terminal output leads to poorly displayed content on either light or dark terminal backgrounds. Some other CLIs have experienced the same issue (https://github.com/qri-io/qri/pull/774). Instead, let's just use color to highlight some of the output so it's more compatible with different background styles. ## Tests <img width="772" alt="Screenshot 2023-06-05 at 16 05 09" src="https://github.com/databricks/cli/assets/2969996/01790239-6a33-4059-86a8-d5117ea0b75f"> --- <img width="757" alt="Screenshot 2023-06-05 at 16 05 20" src="https://github.com/databricks/cli/assets/2969996/ea3b9fdc-3782-4f4f-a9df-19e66af0c04f"> |
|
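A hedged sketch of the idea above, using the github.com/fatih/color package purely as an example: highlight with accent colors such as green or red and leave surrounding text at the terminal's default foreground, rather than forcing FgWhite or FgBlack, which can be unreadable on the opposite background.

```go
package example

import (
	"fmt"

	"github.com/fatih/color"
)

// printStatus highlights only the status word with an accent color; the
// rest of the line keeps the terminal's default foreground color, so it
// stays readable on both light and dark backgrounds.
func printStatus(name string, ok bool) {
	status := color.New(color.FgGreen).Sprint("OK")
	if !ok {
		status = color.New(color.FgRed).Sprint("FAILED")
	}
	fmt.Printf("%s: %s\n", name, status)
}
```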
Andrew Nester |
3dbf7a575a
|
Better error message if can not load prompts (#437)
## Changes Better error message if prompts can not be loaded. ## Tests Set up 2 jobs with the same name and ran `cli jobs get` ``` andrew.nester@HFW9Y94129 multiples-tasks % ../../cli/cli jobs get Error: failed to load names for Jobs drop-down. Please manually specify required arguments. Original error: duplicate .Settings.Name: duplicatejob ``` |
|
Andrew Nester |
f8255f356b
|
Added spinner when loading command prompts (#420)
## Changes Added spinner when loading command prompts ## Tests ``` andrew.nester@HFW9Y94129 multiples-tasks % cli jobs get-run ⡿ No RUN_ID argument specified. Loading names for Jobs drop-down. ``` |
|
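A small sketch of the spinner behavior shown above; github.com/briandowns/spinner is assumed here only for illustration, and `loadJobNames` is a hypothetical callback standing in for the slow API call that populates the drop-down.

```go
package example

import (
	"time"

	"github.com/briandowns/spinner"
)

// promptWithSpinner shows a spinner while names for the drop-down load,
// then stops it before the prompt is rendered.
func promptWithSpinner(loadJobNames func() []string) []string {
	s := spinner.New(spinner.CharSets[11], 100*time.Millisecond)
	s.Suffix = " No RUN_ID argument specified. Loading names for Jobs drop-down."
	s.Start()
	names := loadJobNames() // potentially slow API call
	s.Stop()
	return names
}
```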
Andrew Nester |
3c4d6f637f
|
Changed service template to correctly handle required positional arguments (#405) | |
Pieter Noordhuis |
46df551816
|
Update to Go SDK v0.9.0 (#396)
## Changes See https://github.com/databricks/databricks-sdk-go/releases/tag/v0.9.0. ## Tests Ran integration tests manually. |
|
Pieter Noordhuis |
98ebb78c9b
|
Rename bricks -> databricks (#389)
## Changes Rename all instances of "bricks" to "databricks". ## Tests * Confirmed the goreleaser build works, uses the correct new binary name, and produces the right archives. * Help output is confirmed to be correct. * Output of `git grep -w bricks` is minimal with a couple changes remaining for after the repository rename. |
|
Serge Smertin |
4c4a293015
|
Added OpenAPI command coverage (#357)
This PR adds the following command groups: ## Workspace-level command groups * `bricks alerts` - The alerts API can be used to perform CRUD operations on alerts. * `bricks catalogs` - A catalog is the first layer of Unity Catalog’s three-level namespace. * `bricks cluster-policies` - Cluster policy limits the ability to configure clusters based on a set of rules. * `bricks clusters` - The Clusters API allows you to create, start, edit, list, terminate, and delete clusters. * `bricks current-user` - This API allows retrieving information about currently authenticated user or service principal. * `bricks dashboards` - In general, there is little need to modify dashboards using the API. * `bricks data-sources` - This API is provided to assist you in making new query objects. * `bricks experiments` - MLflow Experiment tracking. * `bricks external-locations` - An external location is an object that combines a cloud storage path with a storage credential that authorizes access to the cloud storage path. * `bricks functions` - Functions implement User-Defined Functions (UDFs) in Unity Catalog. * `bricks git-credentials` - Registers personal access token for Databricks to do operations on behalf of the user. * `bricks global-init-scripts` - The Global Init Scripts API enables Workspace administrators to configure global initialization scripts for their workspace. * `bricks grants` - In Unity Catalog, data is secure by default. * `bricks groups` - Groups simplify identity management, making it easier to assign access to Databricks Workspace, data, and other securable objects. * `bricks instance-pools` - Instance Pools API are used to create, edit, delete and list instance pools by using ready-to-use cloud instances which reduces a cluster start and auto-scaling times. * `bricks instance-profiles` - The Instance Profiles API allows admins to add, list, and remove instance profiles that users can launch clusters with. * `bricks ip-access-lists` - IP Access List enables admins to configure IP access lists. * `bricks jobs` - The Jobs API allows you to create, edit, and delete jobs. * `bricks libraries` - The Libraries API allows you to install and uninstall libraries and get the status of libraries on a cluster. * `bricks metastores` - A metastore is the top-level container of objects in Unity Catalog. * `bricks model-registry` - MLflow Model Registry commands. * `bricks permissions` - Permissions API are used to create read, write, edit, update and manage access for various users on different objects and endpoints. * `bricks pipelines` - The Delta Live Tables API allows you to create, edit, delete, start, and view details about pipelines. * `bricks policy-families` - View available policy families. * `bricks providers` - Databricks Providers REST API. * `bricks queries` - These endpoints are used for CRUD operations on query definitions. * `bricks query-history` - Access the history of queries through SQL warehouses. * `bricks recipient-activation` - Databricks Recipient Activation REST API. * `bricks recipients` - Databricks Recipients REST API. * `bricks repos` - The Repos API allows users to manage their git repos. * `bricks schemas` - A schema (also called a database) is the second layer of Unity Catalog’s three-level namespace. * `bricks secrets` - The Secrets API allows you to manage secrets, secret scopes, and access permissions. * `bricks service-principals` - Identities for use with jobs, automated tools, and systems such as scripts, apps, and CI/CD platforms. 
* `bricks serving-endpoints` - The Serving Endpoints API allows you to create, update, and delete model serving endpoints. * `bricks shares` - Databricks Shares REST API. * `bricks storage-credentials` - A storage credential represents an authentication and authorization mechanism for accessing data stored on your cloud tenant. * `bricks table-constraints` - Primary key and foreign key constraints encode relationships between fields in tables. * `bricks tables` - A table resides in the third layer of Unity Catalog’s three-level namespace. * `bricks token-management` - Enables administrators to get all tokens and delete tokens for other users. * `bricks tokens` - The Token API allows you to create, list, and revoke tokens that can be used to authenticate and access Databricks REST APIs. * `bricks users` - User identities recognized by Databricks and represented by email addresses. * `bricks volumes` - Volumes are a Unity Catalog (UC) capability for accessing, storing, governing, organizing and processing files. * `bricks warehouses` - A SQL warehouse is a compute resource that lets you run SQL commands on data objects within Databricks SQL. * `bricks workspace` - The Workspace API allows you to list, import, export, and delete notebooks and folders. * `bricks workspace-conf` - This API allows updating known workspace settings for advanced users. ## Account-level command groups * `bricks account billable-usage` - This API allows you to download billable usage logs for the specified account and date range. * `bricks account budgets` - These APIs manage budget configuration including notifications for exceeding a budget for a period. * `bricks account credentials` - These APIs manage credential configurations for this workspace. * `bricks account custom-app-integration` - These APIs enable administrators to manage custom oauth app integrations, which is required for adding/using Custom OAuth App Integration like Tableau Cloud for Databricks in AWS cloud. * `bricks account encryption-keys` - These APIs manage encryption key configurations for this workspace (optional). * `bricks account groups` - Groups simplify identity management, making it easier to assign access to Databricks Account, data, and other securable objects. * `bricks account ip-access-lists` - The Accounts IP Access List API enables account admins to configure IP access lists for access to the account console. * `bricks account log-delivery` - These APIs manage log delivery configurations for this account. * `bricks account metastore-assignments` - These APIs manage metastore assignments to a workspace. * `bricks account metastores` - These APIs manage Unity Catalog metastores for an account. * `bricks account networks` - These APIs manage network configurations for customer-managed VPCs (optional). * `bricks account o-auth-enrollment` - These APIs enable administrators to enroll OAuth for their accounts, which is required for adding/using any OAuth published/custom application integration. * `bricks account private-access` - These APIs manage private access settings for this account. * `bricks account published-app-integration` - These APIs enable administrators to manage published oauth app integrations, which is required for adding/using Published OAuth App Integration like Tableau Cloud for Databricks in AWS cloud. * `bricks account service-principals` - Identities for use with jobs, automated tools, and systems such as scripts, apps, and CI/CD platforms. 
* `bricks account storage` - These APIs manage storage configurations for this workspace. * `bricks account storage-credentials` - These APIs manage storage credentials for a particular metastore. * `bricks account users` - User identities recognized by Databricks and represented by email addresses. * `bricks account vpc-endpoints` - These APIs manage VPC endpoint configurations for this account. * `bricks account workspace-assignment` - The Workspace Permission Assignment API allows you to manage workspace permissions for principals in your account. * `bricks account workspaces` - These APIs manage workspaces for this account. |