databricks-cli/go.mod

module github.com/databricks/cli
go 1.18
require (
	github.com/briandowns/spinner v1.23.0 // Apache 2.0
	github.com/databricks/databricks-sdk-go v0.14.1 // Apache 2.0
	github.com/fatih/color v1.15.0 // MIT
	github.com/ghodss/yaml v1.0.0 // MIT + NOTICE
	github.com/google/uuid v1.3.0 // BSD-3-Clause
	github.com/hashicorp/go-version v1.6.0 // MPL 2.0
	github.com/hashicorp/hc-install v0.5.2 // MPL 2.0
	github.com/hashicorp/terraform-exec v0.18.1 // MPL 2.0
	github.com/hashicorp/terraform-json v0.17.1 // MPL 2.0
	github.com/imdario/mergo v0.3.13 // BSD-3-Clause
	github.com/manifoldco/promptui v0.9.0 // BSD-3-Clause
	github.com/mattn/go-isatty v0.0.19 // MIT
	github.com/nwidger/jsoncolor v0.3.2 // MIT
	github.com/pkg/browser v0.0.0-20210911075715-681adbf594b8 // BSD-2-Clause
	github.com/sabhiram/go-gitignore v0.0.0-20210923224102-525f6e181f06 // MIT
	github.com/spf13/cobra v1.7.0 // Apache 2.0
	github.com/spf13/pflag v1.0.5 // BSD-3-Clause
	github.com/stretchr/testify v1.8.4 // MIT
	github.com/whilp/git-urls v1.0.0 // MIT
	golang.org/x/exp v0.0.0-20230310171629-522b1b587ee0
	golang.org/x/mod v0.12.0
golang.org/x/oauth2 v0.11.0
golang.org/x/sync v0.3.0
golang.org/x/term v0.11.0
golang.org/x/text v0.12.0
gopkg.in/ini.v1 v1.67.0 // Apache 2.0
)
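// The block above pins direct dependencies to exact versions, with license
// annotations where the file records them. A hypothetical bump of one of
// these pins would normally go through the Go toolchain rather than a hand
// edit of this file; for example (illustrative commands only, matching the
// oauth2 pin above):
//
//	go get golang.org/x/oauth2@v0.11.0
//	go mod tidy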
require (
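// Entries in this block are transitive ("// indirect") dependencies: modules
// required somewhere in the build graph but not imported directly by this
// module's packages. `go mod tidy` adds and prunes these markers automatically.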
cloud.google.com/go/compute v1.20.1 // indirect
cloud.google.com/go/compute/metadata v0.2.3 // indirect
github.com/ProtonMail/go-crypto v0.0.0-20230217124315-7d5c6f04bbb8 // indirect
github.com/apparentlymart/go-textseg/v13 v13.0.0 // indirect
github.com/chzyer/readline v0.0.0-20180603132655-2972be24d48e // indirect
github.com/cloudflare/circl v1.3.3 // indirect
github.com/davecgh/go-spew v1.1.1 // indirect
github.com/golang/groupcache v0.0.0-20210331224755-41bb18bfe9da // indirect
github.com/golang/protobuf v1.5.3 // indirect
github.com/google/go-querystring v1.1.0 // indirect
github.com/google/s2a-go v0.1.4 // indirect
github.com/googleapis/enterprise-certificate-proxy v0.2.5 // indirect
github.com/hashicorp/go-cleanhttp v0.5.2 // indirect
github.com/inconshreveable/mousetrap v1.1.0 // indirect
github.com/mattn/go-colorable v0.1.13 // indirect
github.com/pmezard/go-difflib v1.0.0 // indirect
github.com/zclconf/go-cty v1.13.2 // indirect
go.opencensus.io v0.24.0 // indirect
golang.org/x/crypto v0.12.0 // indirect
golang.org/x/net v0.14.0 // indirect
golang.org/x/sys v0.11.0 // indirect
golang.org/x/time v0.3.0 // indirect
google.golang.org/api v0.131.0 // indirect
google.golang.org/appengine v1.6.7 // indirect
google.golang.org/genproto/googleapis/rpc v0.0.0-20230706204954-ccb25ca9f130 // indirect
google.golang.org/grpc v1.56.2 // indirect
google.golang.org/protobuf v1.31.0 // indirect
gopkg.in/yaml.v2 v2.4.0 // indirect
gopkg.in/yaml.v3 v3.0.1 // indirect
)
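
// Note: entries marked "// indirect" above are transitive dependencies pulled
// in by the modules this CLI imports directly (for example, the Databricks SDK
// for Go). With the go 1.18 directive, running `go mod tidy` prunes and updates
// this list automatically.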