mirror of https://github.com/databricks/cli.git
Bump github.com/databricks/databricks-sdk-go from 0.45.0 to 0.46.0 (#1760)
Bumps [github.com/databricks/databricks-sdk-go](https://github.com/databricks/databricks-sdk-go) from 0.45.0 to 0.46.0.

**Release notes** (sourced from [github.com/databricks/databricks-sdk-go's releases](https://github.com/databricks/databricks-sdk-go/releases))

**v0.46.0**

**Bug Fixes**

- Fail fast when authenticating if host is not configured ([#1033](https://redirect.github.com/databricks/databricks-sdk-go/pull/1033)).
- Improve non-JSON error handling ([#1031](https://redirect.github.com/databricks/databricks-sdk-go/pull/1031)).

**Internal Changes**

- Add TestAccCreateOboTokenOnAws to flaky test list ([#1029](https://redirect.github.com/databricks/databricks-sdk-go/pull/1029)).
- Add workflows manage integration tests checks ([#1032](https://redirect.github.com/databricks/databricks-sdk-go/pull/1032)).
- Fix TestMwsAccWorkspaces cleanup ([#1028](https://redirect.github.com/databricks/databricks-sdk-go/pull/1028)).
- Improve integration test comment ([#1035](https://redirect.github.com/databricks/databricks-sdk-go/pull/1035)).
- Temporary ignore Metastore test failures ([#1027](https://redirect.github.com/databricks/databricks-sdk-go/pull/1027)).
- Update test to support new accounts ([#1026](https://redirect.github.com/databricks/databricks-sdk-go/pull/1026)).
- Use statuses instead of checks ([#1036](https://redirect.github.com/databricks/databricks-sdk-go/pull/1036)).

**API Changes**

- Added `RegenerateDashboard` method for the `w.QualityMonitors` workspace-level service.
- Added `catalog.RegenerateDashboardRequest` and `catalog.RegenerateDashboardResponse`.
- Added `jobs.QueueDetails`, `jobs.QueueDetailsCodeCode`, `jobs.RunLifecycleStateV2State`, `jobs.RunStatus`, `jobs.TerminationCodeCode`, `jobs.TerminationDetails` and `jobs.TerminationTypeType`.
- Added `Status` field for `jobs.BaseRun`, `jobs.RepairHistoryItem`, `jobs.Run` and `jobs.RunTask`.
- Added `MaxProvisionedThroughput` and `MinProvisionedThroughput` fields for `serving.ServedModelInput`.
- Added `ColumnsToSync` field for `vectorsearch.DeltaSyncVectorIndexSpecRequest`.
- Changed `WorkloadSize` field for `serving.ServedModelInput` to no longer be required.

OpenAPI SHA: d05898328669a3f8ab0c2ecee37db2673d3ea3f7, Date: 2024-09-04

**Changelog**: the `[Release] Release v0.46.0` entry in [CHANGELOG.md](https://github.com/databricks/databricks-sdk-go/blob/main/CHANGELOG.md) repeats the release notes above verbatim.

**Commits**

- `37cb031019` [Release] Release v0.46.0 ([#1037](https://redirect.github.com/databricks/databricks-sdk-go/issues/1037))
- `34f37f9e4c` [Internal] Use statuses instead of checks (#1036)
- `590d597046` [Internal] Improve integration test comment (#1035)
- `6ab81eed78` [Internal] Add workflows manage integration tests checks (#1032)
- `4886afe312` [Fix] Fail fast when authenticating if host is not configured (#1033)
- `796dae1674` [Fix] Handle non-JSON errors (#1031)
- `a24a158b34` [Internal] Add TestAccCreateOboTokenOnAws to flaky test list (#1029)
- `9ab8b42bc4` [Fix] Fix TestMwsAccWorkspaces cleanup (#1028)
- `cc22621c96` [Internal] Temporary ignore Metastore test failures (#1027)
- `8dbaaf2767` [Fix] Update test to support new accounts (#1026)
- See the full diff in the [compare view](https://github.com/databricks/databricks-sdk-go/compare/v0.45.0...v0.46.0).

**Most Recent Ignore Conditions Applied to This Pull Request**

| Dependency Name | Ignore Conditions |
| --- | --- |
| github.com/databricks/databricks-sdk-go | [>= 0.28.a, < 0.29] |

[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=github.com/databricks/databricks-sdk-go&package-manager=go_modules&previous-version=0.45.0&new-version=0.46.0)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`.

**Dependabot commands and options**

You can trigger Dependabot actions by commenting on this PR:

- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
- `@dependabot show <dependency name> ignore conditions` will show all of the ignore conditions of the specified dependency
- `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
- `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
- `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)

---

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Andrew Nester <andrew.nester@databricks.com>
This commit is contained in: parent 0ef1ada14b, commit d3e221a116
@@ -1 +1 @@
-3eae49b444cac5a0118a3503e5b7ecef7f96527a
+d05898328669a3f8ab0c2ecee37db2673d3ea3f7
@@ -2046,6 +2046,12 @@
 "instance_profile_arn": {
   "description": "ARN of the instance profile that the served model will use to access AWS resources."
 },
+"max_provisioned_throughput": {
+  "description": "The maximum tokens per second that the endpoint can scale up to."
+},
+"min_provisioned_throughput": {
+  "description": "The minimum tokens per second that the endpoint can scale down to."
+},
 "model_name": {
   "description": "The name of the model in Databricks Model Registry to be served or if the model resides in Unity Catalog, the full name of model,\nin the form of __catalog_name__.__schema_name__.__model_name__.\n"
 },
@@ -5147,6 +5153,12 @@
 "instance_profile_arn": {
   "description": "ARN of the instance profile that the served model will use to access AWS resources."
 },
+"max_provisioned_throughput": {
+  "description": "The maximum tokens per second that the endpoint can scale up to."
+},
+"min_provisioned_throughput": {
+  "description": "The minimum tokens per second that the endpoint can scale down to."
+},
 "model_name": {
   "description": "The name of the model in Databricks Model Registry to be served or if the model resides in Unity Catalog, the full name of model,\nin the form of __catalog_name__.__schema_name__.__model_name__.\n"
 },
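These two schema hunks surface the new `max_provisioned_throughput` and `min_provisioned_throughput` fields (and the now-optional `workload_size`) of served models to bundle authors. A small sketch of the corresponding SDK struct, assuming the generated `serving.ServedModelInput` field names, integer types, and snake_case JSON tags; the model name and version are placeholders:

```go
package main

import (
	"encoding/json"
	"fmt"
	"log"

	"github.com/databricks/databricks-sdk-go/service/serving"
)

func main() {
	// With v0.46.0, WorkloadSize is no longer required; provisioned
	// throughput bounds can be set instead.
	served := serving.ServedModelInput{
		ModelName:                "main.default.my_model", // placeholder UC model name
		ModelVersion:             "1",
		MinProvisionedThroughput: 100, // tokens per second the endpoint can scale down to
		MaxProvisionedThroughput: 500, // tokens per second the endpoint can scale up to
		ScaleToZeroEnabled:       true,
	}

	out, err := json.MarshalIndent(served, "", "  ")
	if err != nil {
		log.Fatal(err)
	}
	// Should print the snake_case keys that the bundle schema above documents.
	fmt.Println(string(out))
}
```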
@@ -941,7 +941,12 @@ func newListArtifacts() *cobra.Command {
 	cmd.Long = `Get all artifacts.
 
   List artifacts for a run. Takes an optional artifact_path prefix. If it is
-  specified, the response contains only artifacts with the specified prefix.",`
+  specified, the response contains only artifacts with the specified prefix.
+  This API does not support pagination when listing artifacts in UC Volumes. A
+  maximum of 1000 artifacts will be retrieved for UC Volumes. Please call
+  /api/2.0/fs/directories{directory_path} for listing artifacts in UC Volumes,
+  which supports pagination. See [List directory contents | Files
+  API](/api/workspace/files/listdirectorycontents).`
 
 	cmd.Annotations = make(map[string]string)
 
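The new help text points users at the Files API for paginated listing of UC Volumes content. A hedged sketch of doing the same through the Go SDK's `files` service, assuming the generated `ListDirectoryContentsAll` convenience wrapper that drains the paginated listing; the volume path is a placeholder:

```go
package main

import (
	"context"
	"fmt"
	"log"

	"github.com/databricks/databricks-sdk-go"
	"github.com/databricks/databricks-sdk-go/service/files"
)

func main() {
	ctx := context.Background()
	w, err := databricks.NewWorkspaceClient()
	if err != nil {
		log.Fatal(err)
	}

	// Lists a UC Volumes directory via /api/2.0/fs/directories{directory_path},
	// which paginates, unlike the MLflow list-artifacts endpoint.
	entries, err := w.Files.ListDirectoryContentsAll(ctx, files.ListDirectoryContentsRequest{
		DirectoryPath: "/Volumes/main/default/my_volume/artifacts", // placeholder path
	})
	if err != nil {
		log.Fatal(err)
	}
	for _, e := range entries {
		fmt.Println(e.Path)
	}
}
```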
@@ -88,7 +88,9 @@ func newAssign() *cobra.Command {
   Arguments:
     WORKSPACE_ID: A workspace ID.
     METASTORE_ID: The unique ID of the metastore.
-    DEFAULT_CATALOG_NAME: The name of the default catalog in the metastore.`
+    DEFAULT_CATALOG_NAME: The name of the default catalog in the metastore. This field is depracted.
+      Please use "Default Namespace API" to configure the default catalog for a
+      Databricks workspace.`
 
 	cmd.Annotations = make(map[string]string)
@@ -665,7 +667,7 @@ func newUpdateAssignment() *cobra.Command {
 	// TODO: short flags
 	cmd.Flags().Var(&updateAssignmentJson, "json", `either inline JSON string or @path/to/file.json with request body`)
 
-	cmd.Flags().StringVar(&updateAssignmentReq.DefaultCatalogName, "default-catalog-name", updateAssignmentReq.DefaultCatalogName, `The name of the default catalog for the metastore.`)
+	cmd.Flags().StringVar(&updateAssignmentReq.DefaultCatalogName, "default-catalog-name", updateAssignmentReq.DefaultCatalogName, `The name of the default catalog in the metastore.`)
 	cmd.Flags().StringVar(&updateAssignmentReq.MetastoreId, "metastore-id", updateAssignmentReq.MetastoreId, `The unique ID of the metastore.`)
 
 	cmd.Use = "update-assignment WORKSPACE_ID"
@@ -117,9 +117,10 @@ func newGet() *cobra.Command {
 
   Arguments:
     REQUEST_OBJECT_TYPE: The type of the request object. Can be one of the following: alerts,
-      authorization, clusters, cluster-policies, dbsql-dashboards, directories,
-      experiments, files, instance-pools, jobs, notebooks, pipelines, queries,
-      registered-models, repos, serving-endpoints, or warehouses.
+      authorization, clusters, cluster-policies, dashboards, dbsql-dashboards,
+      directories, experiments, files, instance-pools, jobs, notebooks,
+      pipelines, queries, registered-models, repos, serving-endpoints, or
+      warehouses.
     REQUEST_OBJECT_ID: The id of the request object.`
 
 	cmd.Annotations = make(map[string]string)
@@ -245,9 +246,10 @@ func newSet() *cobra.Command {
 
   Arguments:
     REQUEST_OBJECT_TYPE: The type of the request object. Can be one of the following: alerts,
-      authorization, clusters, cluster-policies, dbsql-dashboards, directories,
-      experiments, files, instance-pools, jobs, notebooks, pipelines, queries,
-      registered-models, repos, serving-endpoints, or warehouses.
+      authorization, clusters, cluster-policies, dashboards, dbsql-dashboards,
+      directories, experiments, files, instance-pools, jobs, notebooks,
+      pipelines, queries, registered-models, repos, serving-endpoints, or
+      warehouses.
     REQUEST_OBJECT_ID: The id of the request object.`
 
 	cmd.Annotations = make(map[string]string)
@@ -319,9 +321,10 @@ func newUpdate() *cobra.Command {
 
   Arguments:
     REQUEST_OBJECT_TYPE: The type of the request object. Can be one of the following: alerts,
-      authorization, clusters, cluster-policies, dbsql-dashboards, directories,
-      experiments, files, instance-pools, jobs, notebooks, pipelines, queries,
-      registered-models, repos, serving-endpoints, or warehouses.
+      authorization, clusters, cluster-policies, dashboards, dbsql-dashboards,
+      directories, experiments, files, instance-pools, jobs, notebooks,
+      pipelines, queries, registered-models, repos, serving-endpoints, or
+      warehouses.
     REQUEST_OBJECT_ID: The id of the request object.`
 
 	cmd.Annotations = make(map[string]string)
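The only change in these three help-text hunks is that `dashboards` joins the list of permission object types, alongside the existing `dbsql-dashboards`. For reference, a hedged sketch of reading permissions on such an object through the Go SDK's `iam` service, assuming the generated `GetPermissionRequest` shape; the object ID is a placeholder:

```go
package main

import (
	"context"
	"fmt"
	"log"

	"github.com/databricks/databricks-sdk-go"
	"github.com/databricks/databricks-sdk-go/service/iam"
)

func main() {
	ctx := context.Background()
	w, err := databricks.NewWorkspaceClient()
	if err != nil {
		log.Fatal(err)
	}

	// Roughly what `databricks permissions get dashboards <id>` does.
	perms, err := w.Permissions.Get(ctx, iam.GetPermissionRequest{
		RequestObjectType: "dashboards",
		RequestObjectId:   "01ef0123456789abcdef0123456789ab", // placeholder dashboard ID
	})
	if err != nil {
		log.Fatal(err)
	}
	for _, acl := range perms.AccessControlList {
		fmt.Printf("user=%s group=%s permissions=%v\n", acl.UserName, acl.GroupName, acl.AllPermissions)
	}
}
```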
@@ -41,6 +41,7 @@ func New() *cobra.Command {
 	cmd.AddCommand(newGet())
 	cmd.AddCommand(newGetRefresh())
 	cmd.AddCommand(newListRefreshes())
+	cmd.AddCommand(newRegenerateDashboard())
 	cmd.AddCommand(newRunRefresh())
 	cmd.AddCommand(newUpdate())
 
@@ -503,6 +504,87 @@ func newListRefreshes() *cobra.Command {
 	return cmd
 }
 
+// start regenerate-dashboard command
+
+// Slice with functions to override default command behavior.
+// Functions can be added from the `init()` function in manually curated files in this directory.
+var regenerateDashboardOverrides []func(
+	*cobra.Command,
+	*catalog.RegenerateDashboardRequest,
+)
+
+func newRegenerateDashboard() *cobra.Command {
+	cmd := &cobra.Command{}
+
+	var regenerateDashboardReq catalog.RegenerateDashboardRequest
+	var regenerateDashboardJson flags.JsonFlag
+
+	// TODO: short flags
+	cmd.Flags().Var(&regenerateDashboardJson, "json", `either inline JSON string or @path/to/file.json with request body`)
+
+	cmd.Flags().StringVar(&regenerateDashboardReq.WarehouseId, "warehouse-id", regenerateDashboardReq.WarehouseId, `Optional argument to specify the warehouse for dashboard regeneration.`)
+
+	cmd.Use = "regenerate-dashboard TABLE_NAME"
+	cmd.Short = `Regenerate a monitoring dashboard.`
+	cmd.Long = `Regenerate a monitoring dashboard.
+
+  Regenerates the monitoring dashboard for the specified table.
+
+  The caller must either: 1. be an owner of the table's parent catalog 2. have
+  **USE_CATALOG** on the table's parent catalog and be an owner of the table's
+  parent schema 3. have the following permissions: - **USE_CATALOG** on the
+  table's parent catalog - **USE_SCHEMA** on the table's parent schema - be an
+  owner of the table
+
+  The call must be made from the workspace where the monitor was created. The
+  dashboard will be regenerated in the assets directory that was specified when
+  the monitor was created.
+
+  Arguments:
+    TABLE_NAME: Full name of the table.`
+
+	// This command is being previewed; hide from help output.
+	cmd.Hidden = true
+
+	cmd.Annotations = make(map[string]string)
+
+	cmd.Args = func(cmd *cobra.Command, args []string) error {
+		check := root.ExactArgs(1)
+		return check(cmd, args)
+	}
+
+	cmd.PreRunE = root.MustWorkspaceClient
+	cmd.RunE = func(cmd *cobra.Command, args []string) (err error) {
+		ctx := cmd.Context()
+		w := root.WorkspaceClient(ctx)
+
+		if cmd.Flags().Changed("json") {
+			err = regenerateDashboardJson.Unmarshal(&regenerateDashboardReq)
+			if err != nil {
+				return err
+			}
+		}
+		regenerateDashboardReq.TableName = args[0]
+
+		response, err := w.QualityMonitors.RegenerateDashboard(ctx, regenerateDashboardReq)
+		if err != nil {
+			return err
+		}
+		return cmdio.Render(ctx, response)
+	}
+
+	// Disable completions since they are not applicable.
+	// Can be overridden by manual implementation in `override.go`.
+	cmd.ValidArgsFunction = cobra.NoFileCompletions
+
+	// Apply optional overrides to this command.
+	for _, fn := range regenerateDashboardOverrides {
+		fn(cmd, &regenerateDashboardReq)
+	}
+
+	return cmd
+}
+
 // start run-refresh command
 
 // Slice with functions to override default command behavior.
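Once this lands, the generated wrapper above should be reachable as `databricks quality-monitors regenerate-dashboard TABLE_NAME`, with `--warehouse-id` as its only service-specific flag; note that `cmd.Hidden = true` keeps it out of help output while the API is in preview.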
go.mod

@@ -5,7 +5,7 @@ go 1.22
 require (
 	github.com/Masterminds/semver/v3 v3.3.0 // MIT
 	github.com/briandowns/spinner v1.23.1 // Apache 2.0
-	github.com/databricks/databricks-sdk-go v0.45.0 // Apache 2.0
+	github.com/databricks/databricks-sdk-go v0.46.0 // Apache 2.0
 	github.com/fatih/color v1.17.0 // MIT
 	github.com/ghodss/yaml v1.0.0 // MIT + NOTICE
 	github.com/google/uuid v1.6.0 // BSD-3-Clause
go.sum

@@ -32,8 +32,8 @@ github.com/cncf/udpa/go v0.0.0-20191209042840-269d4d468f6f/go.mod h1:M8M6+tZqaGX
 github.com/cpuguy83/go-md2man/v2 v2.0.4/go.mod h1:tgQtvFlXSQOSOSIRvRPT7W67SCa46tRHOmNcaadrF8o=
 github.com/cyphar/filepath-securejoin v0.2.4 h1:Ugdm7cg7i6ZK6x3xDF1oEu1nfkyfH53EtKeQYTC3kyg=
 github.com/cyphar/filepath-securejoin v0.2.4/go.mod h1:aPGpWjXOXUn2NCNjFvBE6aRxGGx79pTxQpKOJNYHHl4=
-github.com/databricks/databricks-sdk-go v0.45.0 h1:wdx5Wm/ESrahdHeq62WrjLeGjV4r722LLanD8ahI0Mo=
-github.com/databricks/databricks-sdk-go v0.45.0/go.mod h1:ds+zbv5mlQG7nFEU5ojLtgN/u0/9YzZmKQES/CfedzU=
+github.com/databricks/databricks-sdk-go v0.46.0 h1:D0TxmtSVAOsdnfzH4OGtAmcq+8TyA7Z6fA6JEYhupeY=
+github.com/databricks/databricks-sdk-go v0.46.0/go.mod h1:ds+zbv5mlQG7nFEU5ojLtgN/u0/9YzZmKQES/CfedzU=
 github.com/davecgh/go-spew v1.1.0/go.mod h1:J7Y8YcW2NihsgmVo/mv3lAwl/skON4iLHjSsI+c5H38=
 github.com/davecgh/go-spew v1.1.1 h1:vj9j/u1bqnvCEfJOwUhtlOARqs3+rkHYY13jYWTU97c=
 github.com/davecgh/go-spew v1.1.1/go.mod h1:J7Y8YcW2NihsgmVo/mv3lAwl/skON4iLHjSsI+c5H38=