Bump github.com/databricks/databricks-sdk-go from 0.54.0 to 0.55.0 (#2126)

Bumps
[github.com/databricks/databricks-sdk-go](https://github.com/databricks/databricks-sdk-go)
from 0.54.0 to 0.55.0.
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/databricks/databricks-sdk-go/releases">github.com/databricks/databricks-sdk-go's
releases</a>.</em></p>
<blockquote>
<h2>v0.55.0</h2>
<h3>Internal Changes</h3>
<ul>
<li>Bump staticcheck to 0.5.1 and add go 1.23 test coverage (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/pull/1106">#1106</a>).</li>
<li>Bump x/net, x/crypto dependencies (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/pull/1107">#1107</a>).</li>
<li>Create custom codeql.yml (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/pull/1114">#1114</a>).</li>
<li>Decouple serving and oauth2 package (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/pull/1110">#1110</a>).</li>
<li>Migrate workflows that need write access to use hosted runners (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/pull/1112">#1112</a>).</li>
<li>Move package credentials in config (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/pull/1115">#1115</a>).</li>
<li>Update Queries test (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/pull/1104">#1104</a>).</li>
</ul>
<h3>API Changes:</h3>
<ul>
<li>Added <code>NoCompute</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/apps#CreateAppRequest">apps.CreateAppRequest</a>.</li>
<li>Added <code>HasMore</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#BaseJob">jobs.BaseJob</a>.</li>
<li>Added <code>HasMore</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#BaseRun">jobs.BaseRun</a>.</li>
<li>Added <code>PageToken</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#GetJobRequest">jobs.GetJobRequest</a>.</li>
<li>Added <code>HasMore</code> and <code>NextPageToken</code> fields for
<a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#Job">jobs.Job</a>.</li>
<li>Added <code>HasMore</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#Run">jobs.Run</a>.</li>
<li>Added <code>RunAs</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/pipelines#CreatePipeline">pipelines.CreatePipeline</a>.</li>
<li>Added <code>RunAs</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/pipelines#EditPipeline">pipelines.EditPipeline</a>.</li>
<li>Added <code>AuthorizationDetails</code> and <code>EndpointUrl</code>
fields for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#DataPlaneInfo">serving.DataPlaneInfo</a>.</li>
<li>[Breaking] Changed <code>Update</code> method for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/oauth2#AccountFederationPolicyAPI">a.AccountFederationPolicy</a>
account-level service with new required argument order.</li>
<li>[Breaking] Changed <code>Update</code> method for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/oauth2#ServicePrincipalFederationPolicyAPI">a.ServicePrincipalFederationPolicy</a>
account-level service with new required argument order.</li>
<li>Changed <code>UpdateMask</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/oauth2#UpdateAccountFederationPolicyRequest">oauth2.UpdateAccountFederationPolicyRequest</a>
to no longer be required.</li>
<li>Changed <code>UpdateMask</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/oauth2#UpdateServicePrincipalFederationPolicyRequest">oauth2.UpdateServicePrincipalFederationPolicyRequest</a>
to no longer be required.</li>
<li>[Breaking] Changed <code>DaysOfWeek</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/pipelines#RestartWindow">pipelines.RestartWindow</a>
to type <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/pipelines#DayOfWeekList">pipelines.DayOfWeekList</a>.</li>
</ul>
<p>OpenAPI SHA: 779817ed8d63031f5ea761fbd25ee84f38feec0d, Date:
2025-01-08</p>
</blockquote>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/databricks/databricks-sdk-go/blob/main/CHANGELOG.md">github.com/databricks/databricks-sdk-go's
changelog</a>.</em></p>
<blockquote>
<h2>[Release] Release v0.55.0</h2>
<h3>Internal Changes</h3>
<ul>
<li>Bump staticcheck to 0.5.1 and add go 1.23 test coverage (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/pull/1106">#1106</a>).</li>
<li>Bump x/net, x/crypto dependencies (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/pull/1107">#1107</a>).</li>
<li>Create custom codeql.yml (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/pull/1114">#1114</a>).</li>
<li>Decouple serving and oauth2 package (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/pull/1110">#1110</a>).</li>
<li>Migrate workflows that need write access to use hosted runners (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/pull/1112">#1112</a>).</li>
<li>Move package credentials in config (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/pull/1115">#1115</a>).</li>
<li>Update Queries test (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/pull/1104">#1104</a>).</li>
</ul>
<h3>API Changes:</h3>
<ul>
<li>Added <code>NoCompute</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/apps#CreateAppRequest">apps.CreateAppRequest</a>.</li>
<li>Added <code>HasMore</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#BaseJob">jobs.BaseJob</a>.</li>
<li>Added <code>HasMore</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#BaseRun">jobs.BaseRun</a>.</li>
<li>Added <code>PageToken</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#GetJobRequest">jobs.GetJobRequest</a>.</li>
<li>Added <code>HasMore</code> and <code>NextPageToken</code> fields for
<a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#Job">jobs.Job</a>.</li>
<li>Added <code>HasMore</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#Run">jobs.Run</a>.</li>
<li>Added <code>RunAs</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/pipelines#CreatePipeline">pipelines.CreatePipeline</a>.</li>
<li>Added <code>RunAs</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/pipelines#EditPipeline">pipelines.EditPipeline</a>.</li>
<li>Added <code>AuthorizationDetails</code> and <code>EndpointUrl</code>
fields for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#DataPlaneInfo">serving.DataPlaneInfo</a>.</li>
<li>[Breaking] Changed <code>Update</code> method for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/oauth2#AccountFederationPolicyAPI">a.AccountFederationPolicy</a>
account-level service with new required argument order.</li>
<li>[Breaking] Changed <code>Update</code> method for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/oauth2#ServicePrincipalFederationPolicyAPI">a.ServicePrincipalFederationPolicy</a>
account-level service with new required argument order.</li>
<li>Changed <code>UpdateMask</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/oauth2#UpdateAccountFederationPolicyRequest">oauth2.UpdateAccountFederationPolicyRequest</a>
to no longer be required.</li>
<li>Changed <code>UpdateMask</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/oauth2#UpdateServicePrincipalFederationPolicyRequest">oauth2.UpdateServicePrincipalFederationPolicyRequest</a>
to no longer be required.</li>
<li>[Breaking] Changed <code>DaysOfWeek</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/pipelines#RestartWindow">pipelines.RestartWindow</a>
to type <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/pipelines#DayOfWeekList">pipelines.DayOfWeekList</a>.</li>
</ul>
<p>OpenAPI SHA: 779817ed8d63031f5ea761fbd25ee84f38feec0d, Date:
2025-01-08</p>
</blockquote>
</details>
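The jobs pagination fields added in this release (`PageToken` on the request, `NextPageToken` and `HasMore` on the response) enable the usual token-driven paging loop. A minimal self-contained sketch of that loop, with `fetchPage` standing in for a real `Jobs.Get` call (the names and page size here are illustrative, not SDK API):

```go
package main

import "fmt"

// page mirrors the shape the new fields enable: a batch of items plus a
// token that is empty once the last page has been returned.
type page struct {
	Tasks         []string
	NextPageToken string
}

// fetchPage is a stand-in for a jobs Get call with a page token; here it
// serves up to three tasks per page from a fixed slice.
func fetchPage(all []string, token string) page {
	start := 0
	fmt.Sscanf(token, "%d", &start) // empty token -> start at 0
	end := start + 3
	if end > len(all) {
		end = len(all)
	}
	p := page{Tasks: all[start:end]}
	if end < len(all) {
		p.NextPageToken = fmt.Sprintf("%d", end)
	}
	return p
}

// collectTasks drains every page, following NextPageToken until it is empty.
func collectTasks(all []string) []string {
	var out []string
	token := ""
	for {
		p := fetchPage(all, token)
		out = append(out, p.Tasks...)
		if p.NextPageToken == "" {
			return out
		}
		token = p.NextPageToken
	}
}

func main() {
	tasks := []string{"t1", "t2", "t3", "t4", "t5", "t6", "t7"}
	fmt.Println(len(collectTasks(tasks))) // prints 7
}
```

The same shape applies to the `HasMore` fields on `jobs.BaseJob`, `jobs.BaseRun`, `jobs.Job`, and `jobs.Run`: they signal that a follow-up request with the returned token is needed.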
<details>
<summary>Commits</summary>
<ul>
<li><a
href="b83a7262d5"><code>b83a726</code></a>
[Release] Release v0.55.0 (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/issues/1117">#1117</a>)</li>
<li><a
href="23d9c1ea2b"><code>23d9c1e</code></a>
[Internal] Move package credentials in config (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/issues/1115">#1115</a>)</li>
<li><a
href="adc94cabf7"><code>adc94ca</code></a>
[Internal] Decouple serving and oauth2 package (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/issues/1110">#1110</a>)</li>
<li><a
href="83db3cbdab"><code>83db3cb</code></a>
[Internal] Create custom codeql.yml (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/issues/1114">#1114</a>)</li>
<li><a
href="2b55375727"><code>2b55375</code></a>
[Internal] Migrate workflows that need write access to use hosted
runners (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/issues/1">#1</a>...</li>
<li><a
href="03fb2681fa"><code>03fb268</code></a>
[Internal] Bump x/net, x/crypto dependencies (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/issues/1107">#1107</a>)</li>
<li><a
href="28e1a698ab"><code>28e1a69</code></a>
[Internal] Bump staticcheck to 0.5.1 and add go 1.23 test coverage (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/issues/1106">#1106</a>)</li>
<li><a
href="2399d721fe"><code>2399d72</code></a>
[Internal] Update Queries test (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/issues/1104">#1104</a>)</li>
<li>See full diff in <a
href="https://github.com/databricks/databricks-sdk-go/compare/v0.54.0...v0.55.0">compare
view</a></li>
</ul>
</details>
<br />

<details>
<summary>Most Recent Ignore Conditions Applied to This Pull
Request</summary>

| Dependency Name | Ignore Conditions |
| --- | --- |
| github.com/databricks/databricks-sdk-go | [>= 0.28.a, < 0.29] |
</details>


[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=github.com/databricks/databricks-sdk-go&package-manager=go_modules&previous-version=0.54.0&new-version=0.55.0)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)

Dependabot will resolve any conflicts with this PR as long as you don't
alter it yourself. You can also trigger a rebase manually by commenting
`@dependabot rebase`.

[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)

---

<details>
<summary>Dependabot commands and options</summary>
<br />

You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits
that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after
your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge
and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating
it. You can achieve the same result by closing it manually
- `@dependabot show <dependency name> ignore conditions` will show all
of the ignore conditions of the specified dependency
- `@dependabot ignore this major version` will close this PR and stop
Dependabot creating any more for this major version (unless you reopen
the PR or upgrade to it yourself)
- `@dependabot ignore this minor version` will close this PR and stop
Dependabot creating any more for this minor version (unless you reopen
the PR or upgrade to it yourself)
- `@dependabot ignore this dependency` will close this PR and stop
Dependabot creating any more for this dependency (unless you reopen the
PR or upgrade to it yourself)


</details>

---------

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Andrew Nester <andrew.nester@databricks.com>
This commit is contained in:
dependabot[bot] 2025-01-14 16:02:34 +00:00 committed by GitHub
parent fca6abdfac
commit 72e677d0ac
No known key found for this signature in database
GPG Key ID: B5690EEEBB952194
11 changed files with 411 additions and 88 deletions

View File

@@ -1 +1 @@
-a6a317df8327c9b1e5cb59a03a42ffa2aabeef6d
+779817ed8d63031f5ea761fbd25ee84f38feec0d

View File

@@ -1,4 +1,47 @@
# This file is auto-generated. DO NOT EDIT.
github.com/databricks/cli/bundle/config/resources.App:
"active_deployment":
"description": |-
The active deployment of the app. A deployment is considered active when it has been deployed
to the app compute.
"app_status": {}
"compute_status": {}
"create_time":
"description": |-
The creation time of the app. Formatted timestamp in ISO 6801.
"creator":
"description": |-
The email of the user that created the app.
"default_source_code_path":
"description": |-
The default workspace file system path of the source code from which app deployment are
created. This field tracks the workspace source code path of the last active deployment.
"description":
"description": |-
The description of the app.
"name":
"description": |-
The name of the app. The name must contain only lowercase alphanumeric characters and hyphens.
It must be unique within the workspace.
"pending_deployment":
"description": |-
The pending deployment of the app. A deployment is considered pending when it is being prepared
for deployment to the app compute.
"resources":
"description": |-
Resources for the app.
"service_principal_client_id": {}
"service_principal_id": {}
"service_principal_name": {}
"update_time":
"description": |-
The update time of the app. Formatted timestamp in ISO 6801.
"updater":
"description": |-
The email of the user that last updated the app.
"url":
"description": |-
The URL of the app once it is deployed.
github.com/databricks/cli/bundle/config/resources.Cluster:
"apply_policy_default_values":
"description": |-
@@ -220,6 +263,7 @@ github.com/databricks/cli/bundle/config/resources.Job:
"job_clusters":
"description": |-
A list of job cluster specifications that can be shared and reused by tasks of this job. Libraries cannot be declared in a shared job cluster. You must declare dependent libraries in task settings.
If more than 100 job clusters are available, you can paginate through them using :method:jobs/get.
"max_concurrent_runs":
"description": |-
An optional maximum allowed number of concurrent runs of the job.
@@ -250,6 +294,7 @@ github.com/databricks/cli/bundle/config/resources.Job:
"tasks":
"description": |-
A list of task specifications to be executed by this job.
If more than 100 tasks are available, you can paginate through them using :method:jobs/get. Use the `next_page_token` field at the object root to determine if more results are available.
"timeout_seconds":
"description": |-
An optional timeout applied to each run of this job. A value of `0` means no timeout.
@@ -489,6 +534,187 @@ github.com/databricks/cli/bundle/config/resources.Volume:
"description": |-
The storage location on the cloud
"volume_type": {}
github.com/databricks/databricks-sdk-go/service/apps.AppDeployment:
"create_time":
"description": |-
The creation time of the deployment. Formatted timestamp in ISO 6801.
"creator":
"description": |-
The email of the user creates the deployment.
"deployment_artifacts":
"description": |-
The deployment artifacts for an app.
"deployment_id":
"description": |-
The unique id of the deployment.
"mode":
"description": |-
The mode of which the deployment will manage the source code.
"source_code_path":
"description": |-
The workspace file system path of the source code used to create the app deployment. This is different from
`deployment_artifacts.source_code_path`, which is the path used by the deployed app. The former refers
to the original source code location of the app in the workspace during deployment creation, whereas
the latter provides a system generated stable snapshotted source code path used by the deployment.
"status":
"description": |-
Status and status message of the deployment
"update_time":
"description": |-
The update time of the deployment. Formatted timestamp in ISO 6801.
github.com/databricks/databricks-sdk-go/service/apps.AppDeploymentArtifacts:
"source_code_path":
"description": |-
The snapshotted workspace file system path of the source code loaded by the deployed app.
github.com/databricks/databricks-sdk-go/service/apps.AppDeploymentMode:
"_":
"enum":
- |-
SNAPSHOT
- |-
AUTO_SYNC
github.com/databricks/databricks-sdk-go/service/apps.AppDeploymentState:
"_":
"enum":
- |-
SUCCEEDED
- |-
FAILED
- |-
IN_PROGRESS
- |-
CANCELLED
github.com/databricks/databricks-sdk-go/service/apps.AppDeploymentStatus:
"message":
"description": |-
Message corresponding with the deployment state.
"state":
"description": |-
State of the deployment.
github.com/databricks/databricks-sdk-go/service/apps.AppResource:
"description":
"description": |-
Description of the App Resource.
"job": {}
"name":
"description": |-
Name of the App Resource.
"secret": {}
"serving_endpoint": {}
"sql_warehouse": {}
github.com/databricks/databricks-sdk-go/service/apps.AppResourceJob:
"id":
"description": |-
Id of the job to grant permission on.
"permission":
"description": |-
Permissions to grant on the Job. Supported permissions are: "CAN_MANAGE", "IS_OWNER", "CAN_MANAGE_RUN", "CAN_VIEW".
github.com/databricks/databricks-sdk-go/service/apps.AppResourceJobJobPermission:
"_":
"enum":
- |-
CAN_MANAGE
- |-
IS_OWNER
- |-
CAN_MANAGE_RUN
- |-
CAN_VIEW
github.com/databricks/databricks-sdk-go/service/apps.AppResourceSecret:
"key":
"description": |-
Key of the secret to grant permission on.
"permission":
"description": |-
Permission to grant on the secret scope. For secrets, only one permission is allowed. Permission must be one of: "READ", "WRITE", "MANAGE".
"scope":
"description": |-
Scope of the secret to grant permission on.
github.com/databricks/databricks-sdk-go/service/apps.AppResourceSecretSecretPermission:
"_":
"description": |-
Permission to grant on the secret scope. Supported permissions are: "READ", "WRITE", "MANAGE".
"enum":
- |-
READ
- |-
WRITE
- |-
MANAGE
github.com/databricks/databricks-sdk-go/service/apps.AppResourceServingEndpoint:
"name":
"description": |-
Name of the serving endpoint to grant permission on.
"permission":
"description": |-
Permission to grant on the serving endpoint. Supported permissions are: "CAN_MANAGE", "CAN_QUERY", "CAN_VIEW".
github.com/databricks/databricks-sdk-go/service/apps.AppResourceServingEndpointServingEndpointPermission:
"_":
"enum":
- |-
CAN_MANAGE
- |-
CAN_QUERY
- |-
CAN_VIEW
github.com/databricks/databricks-sdk-go/service/apps.AppResourceSqlWarehouse:
"id":
"description": |-
Id of the SQL warehouse to grant permission on.
"permission":
"description": |-
Permission to grant on the SQL warehouse. Supported permissions are: "CAN_MANAGE", "CAN_USE", "IS_OWNER".
github.com/databricks/databricks-sdk-go/service/apps.AppResourceSqlWarehouseSqlWarehousePermission:
"_":
"enum":
- |-
CAN_MANAGE
- |-
CAN_USE
- |-
IS_OWNER
github.com/databricks/databricks-sdk-go/service/apps.ApplicationState:
"_":
"enum":
- |-
DEPLOYING
- |-
RUNNING
- |-
CRASHED
- |-
UNAVAILABLE
github.com/databricks/databricks-sdk-go/service/apps.ApplicationStatus:
"message":
"description": |-
Application status message
"state":
"description": |-
State of the application.
github.com/databricks/databricks-sdk-go/service/apps.ComputeState:
"_":
"enum":
- |-
ERROR
- |-
DELETING
- |-
STARTING
- |-
STOPPING
- |-
UPDATING
- |-
STOPPED
- |-
ACTIVE
github.com/databricks/databricks-sdk-go/service/apps.ComputeStatus:
"message":
"description": |-
Compute status message
"state":
"description": |-
State of the app compute.
github.com/databricks/databricks-sdk-go/service/catalog.MonitorCronSchedule:
"pause_status":
"description": |-
@@ -2116,6 +2342,26 @@ github.com/databricks/databricks-sdk-go/service/ml.ModelVersionTag:
github.com/databricks/databricks-sdk-go/service/pipelines.CronTrigger:
"quartz_cron_schedule": {}
"timezone_id": {}
github.com/databricks/databricks-sdk-go/service/pipelines.DayOfWeek:
"_":
"description": |-
Days of week in which the restart is allowed to happen (within a five-hour window starting at start_hour).
If not specified all days of the week will be used.
"enum":
- |-
MONDAY
- |-
TUESDAY
- |-
WEDNESDAY
- |-
THURSDAY
- |-
FRIDAY
- |-
SATURDAY
- |-
SUNDAY
github.com/databricks/databricks-sdk-go/service/pipelines.DeploymentKind:
"_":
"description": |
@@ -2375,26 +2621,6 @@ github.com/databricks/databricks-sdk-go/service/pipelines.RestartWindow:
"description": |-
Time zone id of restart window. See https://docs.databricks.com/sql/language-manual/sql-ref-syntax-aux-conf-mgmt-set-timezone.html for details.
If not specified, UTC will be used.
github.com/databricks/databricks-sdk-go/service/pipelines.RestartWindowDaysOfWeek:
"_":
"description": |-
Days of week in which the restart is allowed to happen (within a five-hour window starting at start_hour).
If not specified all days of the week will be used.
"enum":
- |-
MONDAY
- |-
TUESDAY
- |-
WEDNESDAY
- |-
THURSDAY
- |-
FRIDAY
- |-
SATURDAY
- |-
SUNDAY
github.com/databricks/databricks-sdk-go/service/pipelines.SchemaSpec:
"destination_catalog":
"description": |-

View File

@@ -1,3 +1,28 @@
github.com/databricks/cli/bundle/config/resources.App:
"app_status":
"description": |-
PLACEHOLDER
"compute_status":
"description": |-
PLACEHOLDER
"config":
"description": |-
PLACEHOLDER
"permissions":
"description": |-
PLACEHOLDER
"service_principal_client_id":
"description": |-
PLACEHOLDER
"service_principal_id":
"description": |-
PLACEHOLDER
"service_principal_name":
"description": |-
PLACEHOLDER
"source_code_path":
"description": |-
PLACEHOLDER
github.com/databricks/cli/bundle/config/resources.Cluster:
"data_security_mode":
"description": |-
@@ -75,6 +100,19 @@ github.com/databricks/cli/bundle/config/resources.Volume:
"volume_type":
"description": |-
PLACEHOLDER
github.com/databricks/databricks-sdk-go/service/apps.AppResource:
"job":
"description": |-
PLACEHOLDER
"secret":
"description": |-
PLACEHOLDER
"serving_endpoint":
"description": |-
PLACEHOLDER
"sql_warehouse":
"description": |-
PLACEHOLDER
github.com/databricks/databricks-sdk-go/service/compute.AwsAttributes:
"availability":
"description": |-

View File

@@ -388,7 +388,7 @@
"$ref": "#/$defs/github.com/databricks/databricks-sdk-go/service/jobs.JobsHealthRules"
},
"job_clusters": {
-"description": "A list of job cluster specifications that can be shared and reused by tasks of this job. Libraries cannot be declared in a shared job cluster. You must declare dependent libraries in task settings.",
+"description": "A list of job cluster specifications that can be shared and reused by tasks of this job. Libraries cannot be declared in a shared job cluster. You must declare dependent libraries in task settings.\nIf more than 100 job clusters are available, you can paginate through them using :method:jobs/get.",
"$ref": "#/$defs/slice/github.com/databricks/databricks-sdk-go/service/jobs.JobCluster"
},
"max_concurrent_runs": {
@@ -426,7 +426,7 @@
"$ref": "#/$defs/map/string"
},
"tasks": {
-"description": "A list of task specifications to be executed by this job.",
+"description": "A list of task specifications to be executed by this job.\nIf more than 100 tasks are available, you can paginate through them using :method:jobs/get. Use the `next_page_token` field at the object root to determine if more results are available.",
"$ref": "#/$defs/slice/github.com/databricks/databricks-sdk-go/service/jobs.Task"
},
"timeout_seconds": {
@@ -1662,10 +1662,20 @@
]
},
"apps.AppDeploymentMode": {
-"type": "string"
+"type": "string",
"enum": [
"SNAPSHOT",
"AUTO_SYNC"
]
},
"apps.AppDeploymentState": {
-"type": "string"
+"type": "string",
"enum": [
"SUCCEEDED",
"FAILED",
"IN_PROGRESS",
"CANCELLED"
]
},
"apps.AppDeploymentStatus": {
"oneOf": [
@@ -1747,7 +1757,13 @@
]
},
"apps.AppResourceJobJobPermission": {
-"type": "string"
+"type": "string",
"enum": [
"CAN_MANAGE",
"IS_OWNER",
"CAN_MANAGE_RUN",
"CAN_VIEW"
]
},
"apps.AppResourceSecret": {
"oneOf": [
@@ -1778,7 +1794,13 @@
]
},
"apps.AppResourceSecretSecretPermission": {
-"type": "string"
+"type": "string",
"description": "Permission to grant on the secret scope. Supported permissions are: \"READ\", \"WRITE\", \"MANAGE\".",
"enum": [
"READ",
"WRITE",
"MANAGE"
]
},
"apps.AppResourceServingEndpoint": {
"oneOf": [
@@ -1805,7 +1827,12 @@
]
},
"apps.AppResourceServingEndpointServingEndpointPermission": {
-"type": "string"
+"type": "string",
"enum": [
"CAN_MANAGE",
"CAN_QUERY",
"CAN_VIEW"
]
},
"apps.AppResourceSqlWarehouse": {
"oneOf": [
@@ -1832,10 +1859,21 @@
]
},
"apps.AppResourceSqlWarehouseSqlWarehousePermission": {
-"type": "string"
+"type": "string",
"enum": [
"CAN_MANAGE",
"CAN_USE",
"IS_OWNER"
]
},
"apps.ApplicationState": {
-"type": "string"
+"type": "string",
"enum": [
"DEPLOYING",
"RUNNING",
"CRASHED",
"UNAVAILABLE"
]
},
"apps.ApplicationStatus": {
"oneOf": [
@@ -1858,7 +1896,16 @@
]
},
"apps.ComputeState": {
-"type": "string"
+"type": "string",
"enum": [
"ERROR",
"DELETING",
"STARTING",
"STOPPING",
"UPDATING",
"STOPPED",
"ACTIVE"
]
},
"apps.ComputeStatus": {
"oneOf": [
@@ -4569,6 +4616,19 @@
}
]
},
"pipelines.DayOfWeek": {
"type": "string",
"description": "Days of week in which the restart is allowed to happen (within a five-hour window starting at start_hour).\nIf not specified all days of the week will be used.",
"enum": [
"MONDAY",
"TUESDAY",
"WEDNESDAY",
"THURSDAY",
"FRIDAY",
"SATURDAY",
"SUNDAY"
]
},
"pipelines.DeploymentKind": {
"type": "string",
"description": "The deployment method that manages the pipeline:\n- BUNDLE: The pipeline is managed by a Databricks Asset Bundle.\n",
@@ -5003,7 +5063,7 @@
"properties": {
"days_of_week": {
"description": "Days of week in which the restart is allowed to happen (within a five-hour window starting at start_hour).\nIf not specified all days of the week will be used.",
-"$ref": "#/$defs/slice/github.com/databricks/databricks-sdk-go/service/pipelines.RestartWindowDaysOfWeek"
+"$ref": "#/$defs/slice/github.com/databricks/databricks-sdk-go/service/pipelines.DayOfWeek"
},
"start_hour": {
"description": "An integer between 0 and 23 denoting the start hour for the restart window in the 24-hour day.\nContinuous pipeline restart is triggered only within a five-hour window starting at this hour.",
@@ -5025,19 +5085,6 @@
}
]
},
"pipelines.RestartWindowDaysOfWeek": {
"type": "string",
"description": "Days of week in which the restart is allowed to happen (within a five-hour window starting at start_hour).\nIf not specified all days of the week will be used.",
"enum": [
"MONDAY",
"TUESDAY",
"WEDNESDAY",
"THURSDAY",
"FRIDAY",
"SATURDAY",
"SUNDAY"
]
},
"pipelines.SchemaSpec": {
"oneOf": [
{
@@ -6619,6 +6666,20 @@
}
]
},
"pipelines.DayOfWeek": {
"oneOf": [
{
"type": "array",
"items": {
"$ref": "#/$defs/github.com/databricks/databricks-sdk-go/service/pipelines.DayOfWeek"
}
},
{
"type": "string",
"pattern": "\\$\\{(var(\\.[a-zA-Z]+([-_]?[a-zA-Z0-9]+)*(\\[[0-9]+\\])*)+)\\}"
}
]
},
"pipelines.IngestionConfig": {
"oneOf": [
{
@@ -6675,20 +6736,6 @@
}
]
},
"pipelines.RestartWindowDaysOfWeek": {
"oneOf": [
{
"type": "array",
"items": {
"$ref": "#/$defs/github.com/databricks/databricks-sdk-go/service/pipelines.RestartWindowDaysOfWeek"
}
},
{
"type": "string",
"pattern": "\\$\\{(var(\\.[a-zA-Z]+([-_]?[a-zA-Z0-9]+)*(\\[[0-9]+\\])*)+)\\}"
}
]
},
"serving.AiGatewayRateLimit": {
"oneOf": [
{

View File

@@ -111,7 +111,7 @@ func newCreate() *cobra.Command {
cmd.Flags().Var(&createJson, "json", `either inline JSON string or @path/to/file.json with request body`)
cmd.Flags().StringVar(&createReq.Policy.Description, "description", createReq.Policy.Description, `Description of the federation policy.`)
-cmd.Flags().StringVar(&createReq.Policy.Name, "name", createReq.Policy.Name, `Name of the federation policy.`)
+cmd.Flags().StringVar(&createReq.Policy.Name, "name", createReq.Policy.Name, `Resource name for the federation policy.`)
// TODO: complex arg: oidc_policy
cmd.Use = "create"
@ -180,7 +180,10 @@ func newDelete() *cobra.Command {
cmd.Use = "delete POLICY_ID" cmd.Use = "delete POLICY_ID"
cmd.Short = `Delete account federation policy.` cmd.Short = `Delete account federation policy.`
cmd.Long = `Delete account federation policy.` cmd.Long = `Delete account federation policy.
Arguments:
POLICY_ID: The identifier for the federation policy.`
cmd.Annotations = make(map[string]string) cmd.Annotations = make(map[string]string)
@ -233,7 +236,10 @@ func newGet() *cobra.Command {
cmd.Use = "get POLICY_ID" cmd.Use = "get POLICY_ID"
cmd.Short = `Get account federation policy.` cmd.Short = `Get account federation policy.`
cmd.Long = `Get account federation policy.` cmd.Long = `Get account federation policy.
Arguments:
POLICY_ID: The identifier for the federation policy.`
cmd.Annotations = make(map[string]string) cmd.Annotations = make(map[string]string)
@@ -339,24 +345,20 @@ func newUpdate() *cobra.Command {
 	cmd.Flags().Var(&updateJson, "json", `either inline JSON string or @path/to/file.json with request body`)
 	cmd.Flags().StringVar(&updateReq.Policy.Description, "description", updateReq.Policy.Description, `Description of the federation policy.`)
-	cmd.Flags().StringVar(&updateReq.Policy.Name, "name", updateReq.Policy.Name, `Name of the federation policy.`)
+	cmd.Flags().StringVar(&updateReq.Policy.Name, "name", updateReq.Policy.Name, `Resource name for the federation policy.`)
 	// TODO: complex arg: oidc_policy

-	cmd.Use = "update POLICY_ID UPDATE_MASK"
+	cmd.Use = "update POLICY_ID"
 	cmd.Short = `Update account federation policy.`
 	cmd.Long = `Update account federation policy.

  Arguments:
-    POLICY_ID:
-    UPDATE_MASK: Field mask is required to be passed into the PATCH request. Field mask
-      specifies which fields of the setting payload will be updated. The field
-      mask needs to be supplied as single string. To specify multiple fields in
-      the field mask, use comma as the separator (no space).`
+    POLICY_ID: The identifier for the federation policy.`

 	cmd.Annotations = make(map[string]string)

 	cmd.Args = func(cmd *cobra.Command, args []string) error {
-		check := root.ExactArgs(2)
+		check := root.ExactArgs(1)
 		return check(cmd, args)
 	}
@@ -378,7 +380,6 @@ func newUpdate() *cobra.Command {
 		}
 	}

 	updateReq.PolicyId = args[0]
-	updateReq.UpdateMask = args[1]

 	response, err := a.FederationPolicy.Update(ctx, updateReq)
 	if err != nil {


@@ -118,7 +118,7 @@ func newCreate() *cobra.Command {
 	cmd.Flags().Var(&createJson, "json", `either inline JSON string or @path/to/file.json with request body`)
 	cmd.Flags().StringVar(&createReq.Policy.Description, "description", createReq.Policy.Description, `Description of the federation policy.`)
-	cmd.Flags().StringVar(&createReq.Policy.Name, "name", createReq.Policy.Name, `Name of the federation policy.`)
+	cmd.Flags().StringVar(&createReq.Policy.Name, "name", createReq.Policy.Name, `Resource name for the federation policy.`)
 	// TODO: complex arg: oidc_policy

 	cmd.Use = "create SERVICE_PRINCIPAL_ID"
@@ -198,7 +198,7 @@ func newDelete() *cobra.Command {
  Arguments:
    SERVICE_PRINCIPAL_ID: The service principal id for the federation policy.
-    POLICY_ID: `
+    POLICY_ID: The identifier for the federation policy.`

 	cmd.Annotations = make(map[string]string)
@@ -259,7 +259,7 @@ func newGet() *cobra.Command {
  Arguments:
    SERVICE_PRINCIPAL_ID: The service principal id for the federation policy.
-    POLICY_ID: `
+    POLICY_ID: The identifier for the federation policy.`

 	cmd.Annotations = make(map[string]string)
@@ -377,25 +377,21 @@ func newUpdate() *cobra.Command {
 	cmd.Flags().Var(&updateJson, "json", `either inline JSON string or @path/to/file.json with request body`)
 	cmd.Flags().StringVar(&updateReq.Policy.Description, "description", updateReq.Policy.Description, `Description of the federation policy.`)
-	cmd.Flags().StringVar(&updateReq.Policy.Name, "name", updateReq.Policy.Name, `Name of the federation policy.`)
+	cmd.Flags().StringVar(&updateReq.Policy.Name, "name", updateReq.Policy.Name, `Resource name for the federation policy.`)
 	// TODO: complex arg: oidc_policy

-	cmd.Use = "update SERVICE_PRINCIPAL_ID POLICY_ID UPDATE_MASK"
+	cmd.Use = "update SERVICE_PRINCIPAL_ID POLICY_ID"
 	cmd.Short = `Update service principal federation policy.`
 	cmd.Long = `Update service principal federation policy.

  Arguments:
    SERVICE_PRINCIPAL_ID: The service principal id for the federation policy.
-    POLICY_ID:
-    UPDATE_MASK: Field mask is required to be passed into the PATCH request. Field mask
-      specifies which fields of the setting payload will be updated. The field
-      mask needs to be supplied as single string. To specify multiple fields in
-      the field mask, use comma as the separator (no space).`
+    POLICY_ID: The identifier for the federation policy.`

 	cmd.Annotations = make(map[string]string)

 	cmd.Args = func(cmd *cobra.Command, args []string) error {
-		check := root.ExactArgs(3)
+		check := root.ExactArgs(2)
 		return check(cmd, args)
 	}
@@ -421,7 +417,6 @@ func newUpdate() *cobra.Command {
 		return fmt.Errorf("invalid SERVICE_PRINCIPAL_ID: %s", args[0])
 	}
 	updateReq.PolicyId = args[1]
-	updateReq.UpdateMask = args[2]

 	response, err := a.ServicePrincipalFederationPolicy.Update(ctx, updateReq)
 	if err != nil {


@@ -625,11 +625,19 @@ func newGet() *cobra.Command {
 	// TODO: short flags

+	cmd.Flags().StringVar(&getReq.PageToken, "page-token", getReq.PageToken, `Use next_page_token returned from the previous GetJob to request the next page of the job's sub-resources.`)
+
 	cmd.Use = "get JOB_ID"
 	cmd.Short = `Get a single job.`
 	cmd.Long = `Get a single job.

  Retrieves the details for a single job.

+  In Jobs API 2.2, requests for a single job support pagination of tasks and
+  job_clusters when either exceeds 100 elements. Use the next_page_token
+  field to check for more results and pass its value as the page_token in
+  subsequent requests. Arrays with fewer than 100 elements in a page will be
+  empty on later pages.
+
  Arguments:
    JOB_ID: The canonical identifier of the job to retrieve information about. This
@@ -847,13 +855,19 @@ func newGetRun() *cobra.Command {
 	cmd.Flags().BoolVar(&getRunReq.IncludeHistory, "include-history", getRunReq.IncludeHistory, `Whether to include the repair history in the response.`)
 	cmd.Flags().BoolVar(&getRunReq.IncludeResolvedValues, "include-resolved-values", getRunReq.IncludeResolvedValues, `Whether to include resolved parameter values in the response.`)
-	cmd.Flags().StringVar(&getRunReq.PageToken, "page-token", getRunReq.PageToken, `To list the next page of job tasks, set this field to the value of the next_page_token returned in the GetJob response.`)
+	cmd.Flags().StringVar(&getRunReq.PageToken, "page-token", getRunReq.PageToken, `Use next_page_token returned from the previous GetRun to request the next page of the run's sub-resources.`)

 	cmd.Use = "get-run RUN_ID"
 	cmd.Short = `Get a single job run.`
 	cmd.Long = `Get a single job run.

-  Retrieve the metadata of a run.
+  Retrieves the metadata of a run.
+
+  In Jobs API 2.2, requests for a single job run support pagination of tasks
+  and job_clusters when either exceeds 100 elements. Use the next_page_token
+  field to check for more results and pass its value as the page_token in
+  subsequent requests. Arrays with fewer than 100 elements in a page will be
+  empty on later pages.

  Arguments:
    RUN_ID: The canonical identifier of the run for which to retrieve the metadata.
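The pagination contract in the new help text (read `next_page_token` from a response, pass it back as `page_token`, stop when the token comes back empty) reduces to a plain token loop. The sketch below models it with a hypothetical `fetchPage` stub and canned task names rather than the real `jobs.Get`/`jobs.GetRun` calls, which need a live workspace:

```go
package main

import "fmt"

// page mimics the shape of a Jobs API 2.2 response: a slice of
// sub-resources plus a token for the next page (empty when done).
type page struct {
	Tasks         []string
	NextPageToken string
}

// fetchPage is a stand-in for a GetJob/GetRun request carrying the
// new page_token parameter; here it just serves canned pages.
func fetchPage(pages map[string]page, token string) page {
	return pages[token]
}

// collectTasks drains every page by feeding each next_page_token
// back in as the page_token of the following request.
func collectTasks(pages map[string]page) []string {
	var all []string
	token := ""
	for {
		p := fetchPage(pages, token)
		all = append(all, p.Tasks...)
		if p.NextPageToken == "" {
			return all
		}
		token = p.NextPageToken
	}
}

func main() {
	pages := map[string]page{
		"":   {Tasks: []string{"t1", "t2"}, NextPageToken: "p2"},
		"p2": {Tasks: []string{"t3"}},
	}
	fmt.Println(collectTasks(pages)) // [t1 t2 t3]
}
```

The same loop shape applies whether the paginated sub-resource is `tasks` or `job_clusters`; an empty first token requests the first page.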


@@ -974,6 +974,7 @@ func newUpdate() *cobra.Command {
 	cmd.Flags().BoolVar(&updateReq.Photon, "photon", updateReq.Photon, `Whether Photon is enabled for this pipeline.`)
 	cmd.Flags().StringVar(&updateReq.PipelineId, "pipeline-id", updateReq.PipelineId, `Unique identifier for this pipeline.`)
 	// TODO: complex arg: restart_window
+	// TODO: complex arg: run_as
 	cmd.Flags().StringVar(&updateReq.Schema, "schema", updateReq.Schema, `The default schema (database) where tables are read from or published to.`)
 	cmd.Flags().BoolVar(&updateReq.Serverless, "serverless", updateReq.Serverless, `Whether serverless compute is enabled for this pipeline.`)
 	cmd.Flags().StringVar(&updateReq.Storage, "storage", updateReq.Storage, `DBFS root directory for storing checkpoints and tables.`)


@@ -391,6 +391,7 @@ func newUpdate() *cobra.Command {
 	cmd.Flags().StringVar(&updateReq.Comment, "comment", updateReq.Comment, `User-provided free-form text description.`)
 	cmd.Flags().StringVar(&updateReq.NewName, "new-name", updateReq.NewName, `New name for the share.`)
+	cmd.Flags().StringVar(&updateReq.Owner, "owner", updateReq.Owner, `Username of current owner of share.`)
 	cmd.Flags().StringVar(&updateReq.StorageRoot, "storage-root", updateReq.StorageRoot, `Storage root URL for the share.`)
 	// TODO: array: updates

go.mod

@@ -7,7 +7,7 @@ toolchain go1.23.4
 require (
 	github.com/Masterminds/semver/v3 v3.3.1 // MIT
 	github.com/briandowns/spinner v1.23.1 // Apache 2.0
-	github.com/databricks/databricks-sdk-go v0.54.0 // Apache 2.0
+	github.com/databricks/databricks-sdk-go v0.55.0 // Apache 2.0
 	github.com/fatih/color v1.18.0 // MIT
 	github.com/google/uuid v1.6.0 // BSD-3-Clause
 	github.com/hashicorp/go-version v1.7.0 // MPL 2.0

go.sum (generated)

@@ -32,8 +32,8 @@ github.com/cncf/udpa/go v0.0.0-20191209042840-269d4d468f6f/go.mod h1:M8M6+tZqaGX
 github.com/cpuguy83/go-md2man/v2 v2.0.4/go.mod h1:tgQtvFlXSQOSOSIRvRPT7W67SCa46tRHOmNcaadrF8o=
 github.com/cyphar/filepath-securejoin v0.2.5 h1:6iR5tXJ/e6tJZzzdMc1km3Sa7RRIVBKAK32O2s7AYfo=
 github.com/cyphar/filepath-securejoin v0.2.5/go.mod h1:aPGpWjXOXUn2NCNjFvBE6aRxGGx79pTxQpKOJNYHHl4=
-github.com/databricks/databricks-sdk-go v0.54.0 h1:L8gsA3NXs+uYU3QtW/OUgjxMQxOH24k0MT9JhB3zLlM=
-github.com/databricks/databricks-sdk-go v0.54.0/go.mod h1:ds+zbv5mlQG7nFEU5ojLtgN/u0/9YzZmKQES/CfedzU=
+github.com/databricks/databricks-sdk-go v0.55.0 h1:ReziD6spzTDltM0ml80LggKo27F3oUjgTinCFDJDnak=
+github.com/databricks/databricks-sdk-go v0.55.0/go.mod h1:JpLizplEs+up9/Z4Xf2x++o3sM9eTTWFGzIXAptKJzI=
 github.com/davecgh/go-spew v1.1.0/go.mod h1:J7Y8YcW2NihsgmVo/mv3lAwl/skON4iLHjSsI+c5H38=
 github.com/davecgh/go-spew v1.1.1 h1:vj9j/u1bqnvCEfJOwUhtlOARqs3+rkHYY13jYWTU97c=
 github.com/davecgh/go-spew v1.1.1/go.mod h1:J7Y8YcW2NihsgmVo/mv3lAwl/skON4iLHjSsI+c5H38=