Commit Graph

12 Commits

Serge Smertin 4c4a293015
Added OpenAPI command coverage (#357)
This PR adds the following command groups:

## Workspace-level command groups

 * `bricks alerts` - The alerts API can be used to perform CRUD operations on alerts.
 * `bricks catalogs` - A catalog is the first layer of Unity Catalog’s three-level namespace.
 * `bricks cluster-policies` - A cluster policy limits the ability to configure clusters based on a set of rules.
 * `bricks clusters` - The Clusters API allows you to create, start, edit, list, terminate, and delete clusters.
 * `bricks current-user` - This API allows retrieving information about the currently authenticated user or service principal.
 * `bricks dashboards` - In general, there is little need to modify dashboards using the API.
 * `bricks data-sources` - This API is provided to assist you in making new query objects.
 * `bricks experiments` - MLflow Experiment tracking.
 * `bricks external-locations` - An external location is an object that combines a cloud storage path with a storage credential that authorizes access to the cloud storage path.
 * `bricks functions` - Functions implement User-Defined Functions (UDFs) in Unity Catalog.
 * `bricks git-credentials` - Registers a personal access token for Databricks to perform operations on behalf of the user.
 * `bricks global-init-scripts` - The Global Init Scripts API enables Workspace administrators to configure global initialization scripts for their workspace.
 * `bricks grants` - In Unity Catalog, data is secure by default.
 * `bricks groups` - Groups simplify identity management, making it easier to assign access to Databricks Workspace, data, and other securable objects.
 * `bricks instance-pools` - The Instance Pools API is used to create, edit, delete, and list instance pools, which use ready-to-use cloud instances to reduce cluster start and auto-scaling times.
 * `bricks instance-profiles` - The Instance Profiles API allows admins to add, list, and remove instance profiles that users can launch clusters with.
 * `bricks ip-access-lists` - The IP Access List API enables admins to configure IP access lists.
 * `bricks jobs` - The Jobs API allows you to create, edit, and delete jobs.
 * `bricks libraries` - The Libraries API allows you to install and uninstall libraries and get the status of libraries on a cluster.
 * `bricks metastores` - A metastore is the top-level container of objects in Unity Catalog.
 * `bricks model-registry` - MLflow Model Registry commands.
 * `bricks permissions` - The Permissions API is used to create read, write, edit, update, and manage access for various users on different objects and endpoints.
 * `bricks pipelines` - The Delta Live Tables API allows you to create, edit, delete, start, and view details about pipelines.
 * `bricks policy-families` - View available policy families.
 * `bricks providers` - Databricks Providers REST API.
 * `bricks queries` - These endpoints are used for CRUD operations on query definitions.
 * `bricks query-history` - Access the history of queries through SQL warehouses.
 * `bricks recipient-activation` - Databricks Recipient Activation REST API.
 * `bricks recipients` - Databricks Recipients REST API.
 * `bricks repos` - The Repos API allows users to manage their git repos.
 * `bricks schemas` - A schema (also called a database) is the second layer of Unity Catalog’s three-level namespace.
 * `bricks secrets` - The Secrets API allows you to manage secrets, secret scopes, and access permissions.
 * `bricks service-principals` - Identities for use with jobs, automated tools, and systems such as scripts, apps, and CI/CD platforms.
 * `bricks serving-endpoints` - The Serving Endpoints API allows you to create, update, and delete model serving endpoints.
 * `bricks shares` - Databricks Shares REST API.
 * `bricks storage-credentials` - A storage credential represents an authentication and authorization mechanism for accessing data stored on your cloud tenant.
 * `bricks table-constraints` - Primary key and foreign key constraints encode relationships between fields in tables.
 * `bricks tables` - A table resides in the third layer of Unity Catalog’s three-level namespace.
 * `bricks token-management` - Enables administrators to get all tokens and delete tokens for other users.
 * `bricks tokens` - The Token API allows you to create, list, and revoke tokens that can be used to authenticate and access Databricks REST APIs.
 * `bricks users` - User identities recognized by Databricks and represented by email addresses.
 * `bricks volumes` - Volumes are a Unity Catalog (UC) capability for accessing, storing, governing, organizing and processing files.
 * `bricks warehouses` - A SQL warehouse is a compute resource that lets you run SQL commands on data objects within Databricks SQL.
 * `bricks workspace` - The Workspace API allows you to list, import, export, and delete notebooks and folders.
 * `bricks workspace-conf` - This API allows updating known workspace settings for advanced users.

## Account-level command groups

 * `bricks account billable-usage` - This API allows you to download billable usage logs for the specified account and date range.
 * `bricks account budgets` - These APIs manage budget configuration including notifications for exceeding a budget for a period.
 * `bricks account credentials` - These APIs manage credential configurations for this workspace.
 * `bricks account custom-app-integration` - These APIs enable administrators to manage custom OAuth app integrations, which is required for adding/using a custom OAuth app integration like Tableau Cloud for Databricks in AWS cloud.
 * `bricks account encryption-keys` - These APIs manage encryption key configurations for this workspace (optional).
 * `bricks account groups` - Groups simplify identity management, making it easier to assign access to Databricks Account, data, and other securable objects.
 * `bricks account ip-access-lists` - The Accounts IP Access List API enables account admins to configure IP access lists for access to the account console.
 * `bricks account log-delivery` - These APIs manage log delivery configurations for this account.
 * `bricks account metastore-assignments` - These APIs manage metastore assignments to a workspace.
 * `bricks account metastores` - These APIs manage Unity Catalog metastores for an account.
 * `bricks account networks` - These APIs manage network configurations for customer-managed VPCs (optional).
 * `bricks account o-auth-enrollment` - These APIs enable administrators to enroll OAuth for their accounts, which is required for adding/using any OAuth published/custom application integration.
 * `bricks account private-access` - These APIs manage private access settings for this account.
 * `bricks account published-app-integration` - These APIs enable administrators to manage published OAuth app integrations, which is required for adding/using a published OAuth app integration like Tableau Cloud for Databricks in AWS cloud.
 * `bricks account service-principals` - Identities for use with jobs, automated tools, and systems such as scripts, apps, and CI/CD platforms.
 * `bricks account storage` - These APIs manage storage configurations for this workspace.
 * `bricks account storage-credentials` - These APIs manage storage credentials for a particular metastore.
 * `bricks account users` - User identities recognized by Databricks and represented by email addresses.
 * `bricks account vpc-endpoints` - These APIs manage VPC endpoint configurations for this account.
 * `bricks account workspace-assignment` - The Workspace Permission Assignment API allows you to manage workspace permissions for principals in your account.
 * `bricks account workspaces` - These APIs manage workspaces for this account.
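
As a quick illustration of the generated command shape (the subcommands shown here are assumptions based on the underlying API operations, not confirmed output):

```
bricks clusters list
bricks account workspaces list
```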
2023-04-26 13:06:16 +02:00
shreyas-goenka 43bc9a0d9d
Use cmdio logger to log bricks cmd execution errors (#348)
## Changes
Uses the cmdio logger to log the execution error

## Tests
Manually by making the root command return fake errors. Here is the
output:
```
shreyas.goenka@THW32HFW6T bricks % bricks bundle validate
Error: my foo error
```

```
shreyas.goenka@THW32HFW6T bricks % bricks bundle validate --progress-format=json
{
  "error": "my foo error"
}
```
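
For context, a minimal self-contained sketch of the behavior above; `logError` is a hypothetical stand-in for the cmdio logger, not its real API:

```go
package main

import (
	"errors"
	"fmt"
)

// logError renders an execution error either as plain text or as JSON,
// mirroring the two outputs above. It stands in for the cmdio logger.
func logError(jsonFormat bool, err error) {
	if jsonFormat {
		fmt.Printf("{\n  \"error\": %q\n}\n", err.Error())
		return
	}
	fmt.Printf("Error: %s\n", err)
}

func main() {
	err := errors.New("my foo error")
	logError(false, err) // Error: my foo error
	logError(true, err)  // {"error": "my foo error"}
}
```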

---------

Co-authored-by: Pieter Noordhuis <pieter.noordhuis@databricks.com>
2023-04-24 12:11:52 +02:00
Pieter Noordhuis c26c7d388a
Update command descriptions (#353)
Cool tagline to be determined.
2023-04-21 13:41:25 +02:00
shreyas-goenka 1fc903943d
Log os.Args, bricks version, and exit status (#324)
## Changes
1. Log `os.Args` and the bricks version before every command execution.
2. After a command execution, log the error and exit code.

## Tests
Manually:

case 1: Run `bricks version` successfully
```
shreyas.goenka@THW32HFW6T bricks % bricks version --log-level=info --log-file stderr
time=2023-04-12T00:15:04.011+02:00 level=INFO source=root.go:34 msg="process args: [bricks, version, --log-level=info, --log-file, stderr]"
time=2023-04-12T00:15:04.011+02:00 level=INFO source=root.go:35 msg="version: 0.0.0-dev+375eb1c50283"
0.0.0-dev+375eb1c50283
time=2023-04-12T00:15:04.011+02:00 level=INFO source=root.go:68 msg="exit code: 0"
```

case 2: Run `bricks bundle deploy` in a working dir where `bundle.yml`
does not exist
```
shreyas.goenka@THW32HFW6T bricks % bricks bundle deploy --log-level=info --log-file=stderr
time=2023-04-12T00:19:16.783+02:00 level=INFO source=root.go:34 msg="process args: [bricks, bundle, deploy, --log-level=info, --log-file=stderr]"
time=2023-04-12T00:19:16.784+02:00 level=INFO source=root.go:35 msg="version: 0.0.0-dev+375eb1c50283"
Error: unable to locate bundle root: bundle.yml not found
time=2023-04-12T00:19:16.784+02:00 level=ERROR source=root.go:64 msg="unable to locate bundle root: bundle.yml not found"
time=2023-04-12T00:19:16.784+02:00 level=ERROR source=root.go:65 msg="exit code: 1"
```
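
To make the two logging points concrete, here is a self-contained sketch; the standard library's `log/slog` stands in for the project's logger, and `run` and the version string are placeholders:

```go
package main

import (
	"errors"
	"log/slog"
	"os"
)

const version = "0.0.0-dev"

// run stands in for executing the root command.
func run() error {
	return errors.New("unable to locate bundle root: bundle.yml not found")
}

func main() {
	// 1. Before execution: log process args and version.
	slog.Info("process args", "args", os.Args)
	slog.Info("version", "version", version)

	// 2. After execution: log the error (if any) and the exit code.
	if err := run(); err != nil {
		slog.Error(err.Error())
		slog.Error("exit code: 1")
		os.Exit(1)
	}
	slog.Info("exit code: 0")
}
```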
2023-04-12 22:12:36 +02:00
shreyas-goenka 8fd3dccca9
Add progress logs for job runs (#276) 2023-03-29 14:58:09 +02:00
Pieter Noordhuis 32a29c6af4
Add structured logging infrastructure (#246)
New global flags:
* `--log-file FILE`: can be literal `stdout`, `stderr`, or a file name (default `stderr`)
* `--log-level LEVEL`: can be `error`, `warn`, `info`, `debug`, `trace`, or `disabled` (default `disabled`)
* `--log-format TYPE`: can be `text` or `json` (default `text`)
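
For instance, the flags combine as follows (invocation shape only; output elided):

```
bricks version --log-level=debug --log-format=json --log-file stderr
```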

New functions in the `log` package take a `context.Context` and retrieve
the logger from said context.

Because we carry the logger in a context, adding
[attributes](https://pkg.go.dev/golang.org/x/exp/slog#hdr-Attrs_and_Values)
to the logger can be done as follows:

```go
ctx = log.NewContext(ctx, log.GetLogger(ctx).With("foo", "bar"))
```
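
The carrier pattern itself is small. A self-contained sketch, using the standard library's `log/slog` in place of the project's `log` package:

```go
package main

import (
	"context"
	"log/slog"
)

type loggerKey struct{}

// NewContext returns a child context carrying the given logger.
func NewContext(ctx context.Context, l *slog.Logger) context.Context {
	return context.WithValue(ctx, loggerKey{}, l)
}

// GetLogger returns the logger carried by ctx, or the default logger.
func GetLogger(ctx context.Context) *slog.Logger {
	if l, ok := ctx.Value(loggerKey{}).(*slog.Logger); ok {
		return l
	}
	return slog.Default()
}

func main() {
	ctx := context.Background()
	ctx = NewContext(ctx, GetLogger(ctx).With("foo", "bar"))
	GetLogger(ctx).Info("hello") // logged with foo=bar attached
}
```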
2023-03-16 14:46:53 +01:00
Pieter Noordhuis 2e01473902
Let caller set BRICKS_UPSTREAM for user agent (#196)
Example when called from vscode (and everything is hooked up):

```
> * User-Agent: bricks/0.0.21-devel databricks-sdk-go/0.2.0 go/1.19.4 os/darwin upstream/databricks-vscode
```
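
A minimal sketch of the mechanism; only the `BRICKS_UPSTREAM` variable name and the example user agent come from the commit, the helper itself is illustrative:

```go
package main

import (
	"fmt"
	"os"
)

// userAgent appends an upstream/<value> pair when the caller (for example
// the VS Code extension) sets BRICKS_UPSTREAM in the environment.
func userAgent() string {
	ua := "bricks/0.0.21-devel databricks-sdk-go/0.2.0 go/1.19.4 os/darwin"
	if upstream := os.Getenv("BRICKS_UPSTREAM"); upstream != "" {
		ua = fmt.Sprintf("%s upstream/%s", ua, upstream)
	}
	return ua
}

func main() {
	os.Setenv("BRICKS_UPSTREAM", "databricks-vscode")
	fmt.Println(userAgent())
}
```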
2023-02-03 17:05:58 +01:00
Pieter Noordhuis 9ca7f8a888
Configure user agent in root command (#195)
This configures the user agent with the bricks version and the name of
the command being executed.

Example user agent value:
```
> * User-Agent: bricks/0.0.21-devel databricks-sdk-go/0.2.0 go/1.19.4 os/darwin cmd/sync auth/pat
```

This is a follow up for #194.
2023-02-03 16:47:33 +01:00
Pieter Noordhuis a354fa1f77
Only display usage string on flag errors (#147) 2022-12-21 11:38:30 +01:00
Pieter Noordhuis 38a9dabcbe
Add command to make API calls (#80)
It's not yet settled whether this should live as a top-level command or be
hidden under some debug scope. Either way, the ability to make arbitrary API
calls and leverage unified auth is a super useful tool.
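
For example (the verb-plus-path shape is an assumption about the command's interface):

```
bricks api get /api/2.0/clusters/list
```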
2022-10-10 10:27:45 +02:00
Serge Smertin ae2dc104f9 add some comments to commands package 2022-05-20 20:43:29 +02:00
Serge Smertin 4e8955085e moved commands to own packages 2022-05-14 19:54:35 +02:00