databricks-cli/cmd/workspace/query-history/query-history.go

// Code generated from OpenAPI specs by Databricks SDK Generator. DO NOT EDIT.

package query_history

import (
	"github.com/databricks/cli/cmd/root"
	"github.com/databricks/cli/libs/cmdio"
"github.com/databricks/databricks-sdk-go/service/sql"
"github.com/spf13/cobra"
)

// Slice with functions to override default command behavior.
// Functions can be added from the `init()` function in manually curated files in this directory.
var cmdOverrides []func(*cobra.Command)
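
// As an illustration (not part of the generated file), a manually curated
// file in this directory (a hypothetical overrides.go, for example) could
// register an override from its init() function:
//
//	func init() {
//		cmdOverrides = append(cmdOverrides, func(cmd *cobra.Command) {
//			// Example: give the command group a shorter alias.
//			cmd.Aliases = append(cmd.Aliases, "qh")
//		})
//	}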

func New() *cobra.Command {
	cmd := &cobra.Command{
		Use: "query-history",
		Short: `A service responsible for storing and retrieving the list of queries run against SQL endpoints and serverless compute.`,
		Long: `A service responsible for storing and retrieving the list of queries run
against SQL endpoints and serverless compute.`,
		GroupID: "sql",
		Annotations: map[string]string{
			"package": "sql",
		},
	}

	// Add methods
	cmd.AddCommand(newList())

	// Apply optional overrides to this command.
	for _, fn := range cmdOverrides {
		fn(cmd)
	}

	return cmd
}
// start list command
// Slice with functions to override default command behavior.
// Functions can be added from the `init()` function in manually curated files in this directory.
var listOverrides []func(
*cobra.Command,
*sql.ListQueryHistoryRequest,
)
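// As a hedged sketch of the override mechanism described above (the hook body
// here is hypothetical and not part of the generated code), a manually curated
// file in this directory could register an override like so:
//
//	func init() {
//		listOverrides = append(listOverrides, func(cmd *cobra.Command, req *sql.ListQueryHistoryRequest) {
//			// For example, tweak the generated help text.
//			cmd.Short = "List the query history"
//		})
//	}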
func newList() *cobra.Command {
cmd := &cobra.Command{}
var listReq sql.ListQueryHistoryRequest
// TODO: short flags
// TODO: complex arg: filter_by
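// filter_by is not yet wired up as a flag. For orientation, a sketch of how
// the request field could be populated directly (field names are assumptions
// based on the sql package of databricks-sdk-go, not verified here):
//
//	listReq.FilterBy = &sql.QueryFilter{
//		StatementIds: []string{"<statement-id>"},
//		WarehouseIds: []string{"<warehouse-id>"},
//	}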
cmd.Flags().BoolVar(&listReq.IncludeMetrics, "include-metrics", listReq.IncludeMetrics, `Whether to include the query metrics with each query.`)
cmd.Flags().IntVar(&listReq.MaxResults, "max-results", listReq.MaxResults, `Limit the number of results returned in one page.`)
cmd.Flags().StringVar(&listReq.PageToken, "page-token", listReq.PageToken, `A token that can be used to get the next page of results.`)
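// The --max-results and --page-token flags map onto cursor-style pagination in
// the underlying API. A rough sketch of driving the same request through the
// Go SDK directly (client construction and response field names are
// assumptions, shown only to illustrate the flow):
//
//	w := databricks.Must(databricks.NewWorkspaceClient())
//	req := sql.ListQueryHistoryRequest{MaxResults: 100}
//	for {
//		resp, err := w.QueryHistory.List(ctx, req)
//		if err != nil {
//			return err
//		}
//		// ...consume resp.Res here...
//		if resp.NextPageToken == "" {
//			break
//		}
//		req.PageToken = resp.NextPageToken
//	}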
cmd.Use = "list"
cmd.Short = `List Queries.`
cmd.Long = `List Queries.
Added OpenAPI command coverage (#357) This PR adds the following command groups: ## Workspace-level command groups * `bricks alerts` - The alerts API can be used to perform CRUD operations on alerts. * `bricks catalogs` - A catalog is the first layer of Unity Catalog’s three-level namespace. * `bricks cluster-policies` - Cluster policy limits the ability to configure clusters based on a set of rules. * `bricks clusters` - The Clusters API allows you to create, start, edit, list, terminate, and delete clusters. * `bricks current-user` - This API allows retrieving information about currently authenticated user or service principal. * `bricks dashboards` - In general, there is little need to modify dashboards using the API. * `bricks data-sources` - This API is provided to assist you in making new query objects. * `bricks experiments` - MLflow Experiment tracking. * `bricks external-locations` - An external location is an object that combines a cloud storage path with a storage credential that authorizes access to the cloud storage path. * `bricks functions` - Functions implement User-Defined Functions (UDFs) in Unity Catalog. * `bricks git-credentials` - Registers personal access token for Databricks to do operations on behalf of the user. * `bricks global-init-scripts` - The Global Init Scripts API enables Workspace administrators to configure global initialization scripts for their workspace. * `bricks grants` - In Unity Catalog, data is secure by default. * `bricks groups` - Groups simplify identity management, making it easier to assign access to Databricks Workspace, data, and other securable objects. * `bricks instance-pools` - Instance Pools API are used to create, edit, delete and list instance pools by using ready-to-use cloud instances which reduces a cluster start and auto-scaling times. * `bricks instance-profiles` - The Instance Profiles API allows admins to add, list, and remove instance profiles that users can launch clusters with. * `bricks ip-access-lists` - IP Access List enables admins to configure IP access lists. * `bricks jobs` - The Jobs API allows you to create, edit, and delete jobs. * `bricks libraries` - The Libraries API allows you to install and uninstall libraries and get the status of libraries on a cluster. * `bricks metastores` - A metastore is the top-level container of objects in Unity Catalog. * `bricks model-registry` - MLflow Model Registry commands. * `bricks permissions` - Permissions API are used to create read, write, edit, update and manage access for various users on different objects and endpoints. * `bricks pipelines` - The Delta Live Tables API allows you to create, edit, delete, start, and view details about pipelines. * `bricks policy-families` - View available policy families. * `bricks providers` - Databricks Providers REST API. * `bricks queries` - These endpoints are used for CRUD operations on query definitions. * `bricks query-history` - Access the history of queries through SQL warehouses. * `bricks recipient-activation` - Databricks Recipient Activation REST API. * `bricks recipients` - Databricks Recipients REST API. * `bricks repos` - The Repos API allows users to manage their git repos. * `bricks schemas` - A schema (also called a database) is the second layer of Unity Catalog’s three-level namespace. * `bricks secrets` - The Secrets API allows you to manage secrets, secret scopes, and access permissions. 
* `bricks service-principals` - Identities for use with jobs, automated tools, and systems such as scripts, apps, and CI/CD platforms. * `bricks serving-endpoints` - The Serving Endpoints API allows you to create, update, and delete model serving endpoints. * `bricks shares` - Databricks Shares REST API. * `bricks storage-credentials` - A storage credential represents an authentication and authorization mechanism for accessing data stored on your cloud tenant. * `bricks table-constraints` - Primary key and foreign key constraints encode relationships between fields in tables. * `bricks tables` - A table resides in the third layer of Unity Catalog’s three-level namespace. * `bricks token-management` - Enables administrators to get all tokens and delete tokens for other users. * `bricks tokens` - The Token API allows you to create, list, and revoke tokens that can be used to authenticate and access Databricks REST APIs. * `bricks users` - User identities recognized by Databricks and represented by email addresses. * `bricks volumes` - Volumes are a Unity Catalog (UC) capability for accessing, storing, governing, organizing and processing files. * `bricks warehouses` - A SQL warehouse is a compute resource that lets you run SQL commands on data objects within Databricks SQL. * `bricks workspace` - The Workspace API allows you to list, import, export, and delete notebooks and folders. * `bricks workspace-conf` - This API allows updating known workspace settings for advanced users. ## Account-level command groups * `bricks account billable-usage` - This API allows you to download billable usage logs for the specified account and date range. * `bricks account budgets` - These APIs manage budget configuration including notifications for exceeding a budget for a period. * `bricks account credentials` - These APIs manage credential configurations for this workspace. * `bricks account custom-app-integration` - These APIs enable administrators to manage custom oauth app integrations, which is required for adding/using Custom OAuth App Integration like Tableau Cloud for Databricks in AWS cloud. * `bricks account encryption-keys` - These APIs manage encryption key configurations for this workspace (optional). * `bricks account groups` - Groups simplify identity management, making it easier to assign access to Databricks Account, data, and other securable objects. * `bricks account ip-access-lists` - The Accounts IP Access List API enables account admins to configure IP access lists for access to the account console. * `bricks account log-delivery` - These APIs manage log delivery configurations for this account. * `bricks account metastore-assignments` - These APIs manage metastore assignments to a workspace. * `bricks account metastores` - These APIs manage Unity Catalog metastores for an account. * `bricks account networks` - These APIs manage network configurations for customer-managed VPCs (optional). * `bricks account o-auth-enrollment` - These APIs enable administrators to enroll OAuth for their accounts, which is required for adding/using any OAuth published/custom application integration. * `bricks account private-access` - These APIs manage private access settings for this account. * `bricks account published-app-integration` - These APIs enable administrators to manage published oauth app integrations, which is required for adding/using Published OAuth App Integration like Tableau Cloud for Databricks in AWS cloud. 
* `bricks account service-principals` - Identities for use with jobs, automated tools, and systems such as scripts, apps, and CI/CD platforms. * `bricks account storage` - These APIs manage storage configurations for this workspace. * `bricks account storage-credentials` - These APIs manage storage credentials for a particular metastore. * `bricks account users` - User identities recognized by Databricks and represented by email addresses. * `bricks account vpc-endpoints` - These APIs manage VPC endpoint configurations for this account. * `bricks account workspace-assignment` - The Workspace Permission Assignment API allows you to manage workspace permissions for principals in your account. * `bricks account workspaces` - These APIs manage workspaces for this account.
2023-04-26 11:06:16 +00:00
Bump github.com/databricks/databricks-sdk-go from 0.44.0 to 0.45.0 (#1719) Bumps [github.com/databricks/databricks-sdk-go](https://github.com/databricks/databricks-sdk-go) from 0.44.0 to 0.45.0. <details> <summary>Release notes</summary> <p><em>Sourced from <a href="https://github.com/databricks/databricks-sdk-go/releases">github.com/databricks/databricks-sdk-go's releases</a>.</em></p> <blockquote> <h2>v0.45.0</h2> <h2>0.45.0</h2> <h3>Bug Fixes</h3> <ul> <li>Add INVALID_STATE to error code mapping (<a href="https://redirect.github.com/databricks/databricks-sdk-go/pull/1014">#1014</a>).</li> <li>Do not specify <code>--tenant</code> flag when fetching managed identity access token from the CLI (<a href="https://redirect.github.com/databricks/databricks-sdk-go/pull/1021">#1021</a>).</li> </ul> <h3>Internal Changes</h3> <ul> <li>Add terraform aliases to Entity (<a href="https://redirect.github.com/databricks/databricks-sdk-go/pull/1017">#1017</a>).</li> <li>Added Service.NamedIdMap (<a href="https://redirect.github.com/databricks/databricks-sdk-go/pull/1016">#1016</a>).</li> <li>Fix billing test for budget configuration update (<a href="https://redirect.github.com/databricks/databricks-sdk-go/pull/1019">#1019</a>).</li> </ul> <h3>API Changes:</h3> <ul> <li>Added <a href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/compute#PolicyComplianceForClustersAPI">w.PolicyComplianceForClusters</a> workspace-level service.</li> <li>Added <a href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#PolicyComplianceForJobsAPI">w.PolicyComplianceForJobs</a> workspace-level service.</li> <li>Added <a href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#ResourceQuotasAPI">w.ResourceQuotas</a> workspace-level service.</li> <li>Added <a href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#GetQuotaRequest">catalog.GetQuotaRequest</a>, <a href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#GetQuotaResponse">catalog.GetQuotaResponse</a>, <a href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#ListQuotasRequest">catalog.ListQuotasRequest</a>, <a href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#ListQuotasResponse">catalog.ListQuotasResponse</a> and <a href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#QuotaInfo">catalog.QuotaInfo</a>.</li> <li>Added <a href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/compute#ClusterCompliance">compute.ClusterCompliance</a>, <a href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/compute#ClusterSettingsChange">compute.ClusterSettingsChange</a>, <a href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/compute#EnforceClusterComplianceRequest">compute.EnforceClusterComplianceRequest</a>, <a href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/compute#EnforceClusterComplianceResponse">compute.EnforceClusterComplianceResponse</a>, <a href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/compute#GetClusterComplianceRequest">compute.GetClusterComplianceRequest</a>, <a href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/compute#GetClusterComplianceResponse">compute.GetClusterComplianceResponse</a>, <a href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/compute#ListClusterCompliancesRequest">compute.ListClusterCompliancesRequest</a> and 
<a href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/compute#ListClusterCompliancesResponse">compute.ListClusterCompliancesResponse</a>.</li> <li>Added <a href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#EnforcePolicyComplianceForJobResponseJobClusterSettingsChange">jobs.EnforcePolicyComplianceForJobResponseJobClusterSettingsChange</a>, <a href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#EnforcePolicyComplianceRequest">jobs.EnforcePolicyComplianceRequest</a>, <a href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#EnforcePolicyComplianceResponse">jobs.EnforcePolicyComplianceResponse</a>, <a href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#GetPolicyComplianceRequest">jobs.GetPolicyComplianceRequest</a>, <a href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#GetPolicyComplianceResponse">jobs.GetPolicyComplianceResponse</a>, <a href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#JobCompliance">jobs.JobCompliance</a>, <a href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#ListJobComplianceForPolicyResponse">jobs.ListJobComplianceForPolicyResponse</a> and <a href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#ListJobComplianceRequest">jobs.ListJobComplianceRequest</a>.</li> <li>Added <code>Fallback</code> field for <a href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#CreateExternalLocation">catalog.CreateExternalLocation</a>.</li> <li>Added <code>Fallback</code> field for <a href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#ExternalLocationInfo">catalog.ExternalLocationInfo</a>.</li> <li>Added <code>Fallback</code> field for <a href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#UpdateExternalLocation">catalog.UpdateExternalLocation</a>.</li> <li>Added <code>JobRunId</code> field for <a href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#BaseRun">jobs.BaseRun</a>.</li> <li>Added <code>JobRunId</code> field for <a href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#Run">jobs.Run</a>.</li> <li>Added <code>IncludeMetrics</code> field for <a href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/sql#ListQueryHistoryRequest">sql.ListQueryHistoryRequest</a>.</li> <li>Added <code>StatementIds</code> field for <a href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/sql#QueryFilter">sql.QueryFilter</a>.</li> <li>Removed <a href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/sql#ContextFilter">sql.ContextFilter</a>.</li> <li>Removed <code>ContextFilter</code> field for <a href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/sql#QueryFilter">sql.QueryFilter</a>.</li> <li>Removed <code>PipelineId</code> and <code>PipelineUpdateId</code> fields for <a href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/sql#QuerySource">sql.QuerySource</a>.</li> </ul> <p>OpenAPI SHA: 3eae49b444cac5a0118a3503e5b7ecef7f96527a, Date: 2024-08-21</p> </blockquote> </details> <details> <summary>Changelog</summary> <p><em>Sourced from <a href="https://github.com/databricks/databricks-sdk-go/blob/main/CHANGELOG.md">github.com/databricks/databricks-sdk-go's changelog</a>.</em></p> <blockquote> <h2>[Release] Release v0.45.0</h2> <h3>Bug Fixes</h3> <ul> <li>Add 
INVALID_STATE to error code mapping (<a href="https://redirect.github.com/databricks/databricks-sdk-go/pull/1014">#1014</a>).</li> <li>Do not specify <code>--tenant</code> flag when fetching managed identity access token from the CLI (<a href="https://redirect.github.com/databricks/databricks-sdk-go/pull/1021">#1021</a>).</li> </ul> <h3>Internal Changes</h3> <ul> <li>Add terraform aliases to Entity (<a href="https://redirect.github.com/databricks/databricks-sdk-go/pull/1017">#1017</a>).</li> <li>Added Service.NamedIdMap (<a href="https://redirect.github.com/databricks/databricks-sdk-go/pull/1016">#1016</a>).</li> <li>Fix billing test for budget configuration update (<a href="https://redirect.github.com/databricks/databricks-sdk-go/pull/1019">#1019</a>).</li> </ul> <h3>API Changes:</h3> <ul> <li>Added <a href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/compute#PolicyComplianceForClustersAPI">w.PolicyComplianceForClusters</a> workspace-level service.</li> <li>Added <a href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#PolicyComplianceForJobsAPI">w.PolicyComplianceForJobs</a> workspace-level service.</li> <li>Added <a href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#ResourceQuotasAPI">w.ResourceQuotas</a> workspace-level service.</li> <li>Added <a href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#GetQuotaRequest">catalog.GetQuotaRequest</a>, <a href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#GetQuotaResponse">catalog.GetQuotaResponse</a>, <a href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#ListQuotasRequest">catalog.ListQuotasRequest</a>, <a href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#ListQuotasResponse">catalog.ListQuotasResponse</a> and <a href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#QuotaInfo">catalog.QuotaInfo</a>.</li> <li>Added <a href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/compute#ClusterCompliance">compute.ClusterCompliance</a>, <a href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/compute#ClusterSettingsChange">compute.ClusterSettingsChange</a>, <a href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/compute#EnforceClusterComplianceRequest">compute.EnforceClusterComplianceRequest</a>, <a href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/compute#EnforceClusterComplianceResponse">compute.EnforceClusterComplianceResponse</a>, <a href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/compute#GetClusterComplianceRequest">compute.GetClusterComplianceRequest</a>, <a href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/compute#GetClusterComplianceResponse">compute.GetClusterComplianceResponse</a>, <a href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/compute#ListClusterCompliancesRequest">compute.ListClusterCompliancesRequest</a> and <a href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/compute#ListClusterCompliancesResponse">compute.ListClusterCompliancesResponse</a>.</li> <li>Added <a href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#EnforcePolicyComplianceForJobResponseJobClusterSettingsChange">jobs.EnforcePolicyComplianceForJobResponseJobClusterSettingsChange</a>, <a 
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#EnforcePolicyComplianceRequest">jobs.EnforcePolicyComplianceRequest</a>, <a href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#EnforcePolicyComplianceResponse">jobs.EnforcePolicyComplianceResponse</a>, <a href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#GetPolicyComplianceRequest">jobs.GetPolicyComplianceRequest</a>, <a href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#GetPolicyComplianceResponse">jobs.GetPolicyComplianceResponse</a>, <a href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#JobCompliance">jobs.JobCompliance</a>, <a href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#ListJobComplianceForPolicyResponse">jobs.ListJobComplianceForPolicyResponse</a> and <a href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#ListJobComplianceRequest">jobs.ListJobComplianceRequest</a>.</li> <li>Added <code>Fallback</code> field for <a href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#CreateExternalLocation">catalog.CreateExternalLocation</a>.</li> <li>Added <code>Fallback</code> field for <a href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#ExternalLocationInfo">catalog.ExternalLocationInfo</a>.</li> <li>Added <code>Fallback</code> field for <a href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#UpdateExternalLocation">catalog.UpdateExternalLocation</a>.</li> <li>Added <code>JobRunId</code> field for <a href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#BaseRun">jobs.BaseRun</a>.</li> <li>Added <code>JobRunId</code> field for <a href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#Run">jobs.Run</a>.</li> <li>Added <code>IncludeMetrics</code> field for <a href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/sql#ListQueryHistoryRequest">sql.ListQueryHistoryRequest</a>.</li> <li>Added <code>StatementIds</code> field for <a href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/sql#QueryFilter">sql.QueryFilter</a>.</li> <li>Removed <a href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/sql#ContextFilter">sql.ContextFilter</a>.</li> <li>Removed <code>ContextFilter</code> field for <a href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/sql#QueryFilter">sql.QueryFilter</a>.</li> <li>Removed <code>PipelineId</code> and <code>PipelineUpdateId</code> fields for <a href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/sql#QuerySource">sql.QuerySource</a>.</li> </ul> <p>OpenAPI SHA: 3eae49b444cac5a0118a3503e5b7ecef7f96527a, Date: 2024-08-21</p> </blockquote> </details> <details> <summary>Commits</summary> <ul> <li><a href="https://github.com/databricks/databricks-sdk-go/commit/6d867882d057545f3d5136929a449c513ed401af"><code>6d86788</code></a> [Release] Release v0.45.0 (<a href="https://redirect.github.com/databricks/databricks-sdk-go/issues/1023">#1023</a>)</li> <li><a href="https://github.com/databricks/databricks-sdk-go/commit/ba4489b946040db111e22bd4c14b24b66512dc5a"><code>ba4489b</code></a> [Fix] Do not specify <code>--tenant</code> flag when fetching managed identity access to...</li> <li><a href="https://github.com/databricks/databricks-sdk-go/commit/f6248097d13dceeb8a6f0b67b19de4d7a8de227d"><code>f624809</code></a> [Internal] Fix 
billing test for budget configuration update (<a href="https://redirect.github.com/databricks/databricks-sdk-go/issues/1019">#1019</a>)</li> <li><a href="https://github.com/databricks/databricks-sdk-go/commit/27a50556090fd63ffcf7b6edd4150f414587923d"><code>27a5055</code></a> [Internal] Add terraform aliases to Entity (<a href="https://redirect.github.com/databricks/databricks-sdk-go/issues/1017">#1017</a>)</li> <li><a href="https://github.com/databricks/databricks-sdk-go/commit/382a38d380bde46f5b4a24abf68de370f85af62b"><code>382a38d</code></a> [Internal] Added Service.NamedIdMap (<a href="https://redirect.github.com/databricks/databricks-sdk-go/issues/1016">#1016</a>)</li> <li><a href="https://github.com/databricks/databricks-sdk-go/commit/1ef9931dc98885d8359796f691306854341662ac"><code>1ef9931</code></a> [Fix] Add INVALID_STATE to error code mapping (<a href="https://redirect.github.com/databricks/databricks-sdk-go/issues/1014">#1014</a>)</li> <li>See full diff in <a href="https://github.com/databricks/databricks-sdk-go/compare/v0.44.0...v0.45.0">compare view</a></li> </ul> </details> <br /> <details> <summary>Most Recent Ignore Conditions Applied to This Pull Request</summary> | Dependency Name | Ignore Conditions | | --- | --- | | github.com/databricks/databricks-sdk-go | [>= 0.28.a, < 0.29] | </details> [![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=github.com/databricks/databricks-sdk-go&package-manager=go_modules&previous-version=0.44.0&new-version=0.45.0)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores) Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`. [//]: # (dependabot-automerge-start) [//]: # (dependabot-automerge-end) --- <details> <summary>Dependabot commands and options</summary> <br /> You can trigger Dependabot actions by commenting on this PR: - `@dependabot rebase` will rebase this PR - `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it - `@dependabot merge` will merge this PR after your CI passes on it - `@dependabot squash and merge` will squash and merge this PR after your CI passes on it - `@dependabot cancel merge` will cancel a previously requested merge and block automerging - `@dependabot reopen` will reopen this PR if it is closed - `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually - `@dependabot show <dependency name> ignore conditions` will show all of the ignore conditions of the specified dependency - `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself) </details> --------- Signed-off-by: dependabot[bot] <support@github.com> Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com> Co-authored-by: Andrew Nester <andrew.nester@databricks.com>
2024-08-27 08:54:05 +00:00
  List the history of queries through SQL warehouses and serverless compute.
  You can filter by user ID, warehouse ID, status, and time range. Most recently
  started queries are returned first (up to max_results in the request). The
  pagination token returned in the response can be used to list subsequent query
  statuses.`
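
	// Example invocations (illustrative only; the flag names assume the
	// defaults the generator derives from sql.ListQueryHistoryRequest,
	// e.g. --max-results and --page-token):
	//
	//   bricks query-history list --max-results 10
	//   bricks query-history list --page-token $NEXT_PAGE_TOKEN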

	cmd.Annotations = make(map[string]string)
	cmd.Args = func(cmd *cobra.Command, args []string) error {
		// This command takes no positional arguments; all input arrives via flags.
		check := root.ExactArgs(0)
		return check(cmd, args)
	}

	// MustWorkspaceClient authenticates and stores a workspace client on the
	// command context before RunE executes.
	cmd.PreRunE = root.MustWorkspaceClient
	cmd.RunE = func(cmd *cobra.Command, args []string) (err error) {
		ctx := cmd.Context()
		// Retrieve the workspace client that MustWorkspaceClient stored on
		// the command context.
		w := root.WorkspaceClient(ctx)

		response, err := w.QueryHistory.List(ctx, listReq)
		if err != nil {
			return err
		}
		return cmdio.Render(ctx, response)
	}

	// Disable completions since they are not applicable.
	// Can be overridden by manual implementation in `override.go`.
	cmd.ValidArgsFunction = cobra.NoFileCompletions

	// Apply optional overrides to this command; a sketch of an override
	// follows this function.
	for _, fn := range listOverrides {
		fn(cmd, &listReq)
	}

	return cmd
}
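
// The listOverrides hook applied above lets hand-written code in override.go
// adjust the generated command before it is returned. A minimal sketch,
// assuming the usual override registration pattern (the override body shown
// is illustrative, not part of this file):
//
//	func init() {
//		listOverrides = append(listOverrides, func(cmd *cobra.Command, listReq *sql.ListQueryHistoryRequest) {
//			// For example, default to smaller pages unless --max-results is set.
//			listReq.MaxResults = 100
//		})
//	}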
// end service QueryHistory
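
// For reference, a minimal programmatic sketch of the call this command
// wraps, assuming databricks-sdk-go's WorkspaceClient and the sql request
// type used above (the pagination field names are illustrative):
//
//	ctx := context.Background()
//	w, err := databricks.NewWorkspaceClient()
//	if err != nil {
//		return err
//	}
//	resp, err := w.QueryHistory.List(ctx, sql.ListQueryHistoryRequest{
//		MaxResults: 10,
//	})
//	if err != nil {
//		return err
//	}
//	// resp.NextPageToken feeds PageToken on the next request to page onward.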