Commit Graph

23 Commits

Author SHA1 Message Date
Gleb Kanterov 25737bbb5d
Add regression tests for CLI error output (#1566)
## Changes
Add regression tests for https://github.com/databricks/cli/issues/1563

We test two code paths:
- if there is an error, it is printed to stderr
- if there is valid output, it is printed to stdout

We should also consider adding black-box tests that run the CLI binary
and inspect its stderr/stdout output.
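
A minimal sketch of what such a test might look like, using a stand-in
cobra command rather than the CLI's actual commands or test helpers:

```go
package main

import (
	"bytes"
	"errors"
	"testing"

	"github.com/spf13/cobra"
)

// newEchoCmd is a stand-in command used only for this sketch.
func newEchoCmd(fail bool) *cobra.Command {
	return &cobra.Command{
		Use:          "echo",
		SilenceUsage: true,
		RunE: func(cmd *cobra.Command, args []string) error {
			if fail {
				return errors.New("boom")
			}
			cmd.Println("ok")
			return nil
		},
	}
}

func TestErrorIsPrintedToStderr(t *testing.T) {
	var stdout, stderr bytes.Buffer
	cmd := newEchoCmd(true)
	cmd.SetOut(&stdout)
	cmd.SetErr(&stderr)
	cmd.SetArgs([]string{})

	if err := cmd.Execute(); err == nil {
		t.Fatal("expected an error")
	}
	if stdout.Len() != 0 {
		t.Errorf("expected empty stdout, got %q", stdout.String())
	}
	if !bytes.Contains(stderr.Bytes(), []byte("boom")) {
		t.Errorf("expected the error on stderr, got %q", stderr.String())
	}
}

func TestOutputIsPrintedToStdout(t *testing.T) {
	var stdout, stderr bytes.Buffer
	cmd := newEchoCmd(false)
	cmd.SetOut(&stdout)
	cmd.SetErr(&stderr)
	cmd.SetArgs([]string{})

	if err := cmd.Execute(); err != nil {
		t.Fatal(err)
	}
	if stdout.String() != "ok\n" {
		t.Errorf("expected %q on stdout, got %q", "ok\n", stdout.String())
	}
	if stderr.Len() != 0 {
		t.Errorf("expected empty stderr, got %q", stderr.String())
	}
}
```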

## Tests
Unit tests
2024-07-10 06:38:06 +00:00
Pieter Noordhuis 4ccc70aeac
Consolidate environment variable interaction (#747)
## Changes

There are a couple of places throughout the code base where interaction
with environment variables takes place. Moreover, several of these read
a value from more than one environment variable as a fallback (for
backwards compatibility). This change consolidates those accesses.

The majority of diffs in this change are mechanical (i.e. add an
argument or replace a call).

This change:
* Moves common environment variable lookups for bundles to
`bundles/env`.
* Adds a `libs/env` package that wraps `os.LookupEnv` and `os.Getenv`
and allows overrides to take place in a `context.Context`. By scoping
overrides to a `context.Context` we can avoid `t.Setenv` in testing and
unlock parallel test execution for integration tests (a sketch of this
pattern follows the list).
* Updates call sites to pass through a `context.Context` where needed.
* For bundles, introduces `DATABRICKS_BUNDLE_ROOT` as the new primary
variable, replacing `BUNDLE_ROOT`. This was the last environment
variable that did not use the `DATABRICKS_` prefix.
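
A rough sketch of the context-scoped override pattern mentioned above
(illustrative only; the real `libs/env` API may differ):

```go
package env

import (
	"context"
	"os"
)

type overridesKey struct{}

// overrides returns the override map stored in the context, if any.
func overrides(ctx context.Context) map[string]string {
	m, _ := ctx.Value(overridesKey{}).(map[string]string)
	return m
}

// Set returns a child context in which key resolves to value, without
// mutating the process environment. Tests can use this instead of
// t.Setenv and therefore run in parallel.
func Set(ctx context.Context, key, value string) context.Context {
	m := map[string]string{}
	for k, v := range overrides(ctx) {
		m[k] = v
	}
	m[key] = value
	return context.WithValue(ctx, overridesKey{}, m)
}

// Lookup consults context overrides first and falls back to the real
// process environment.
func Lookup(ctx context.Context, key string) (string, bool) {
	if v, ok := overrides(ctx)[key]; ok {
		return v, true
	}
	return os.LookupEnv(key)
}

// Get is like Lookup but returns an empty string if the variable is unset.
func Get(ctx context.Context, key string) string {
	v, _ := Lookup(ctx, key)
	return v
}
```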

## Tests

Unit tests pass.
2023-09-11 08:18:43 +00:00
Pieter Noordhuis bee7a16cb0
Remove dependency on global state for remaining commands (#613)
## Changes

This removes the remaining dependency on global state and unblocks work
to parallelize integration tests. As is, we can already uncomment an
integration test that had to be skipped because of other tests tainting
global state. This is no longer an issue.

Also see #595 and #606.

## Tests

* Unit and integration tests pass.
* Manually confirmed the help output is the same.
2023-07-27 10:03:08 +00:00
Pieter Noordhuis 3fa400f00f
Remove dependency on global state in generated commands (#595)
## Changes

Generated commands relied on global variables for flags and request
payloads. This makes testing difficult when a sequence of tests runs the
same command with various arguments, because the shared global state
causes test interference. Moreover, it makes it impossible to run tests
in parallel.

This change modifies the approach and turns every command group and
command into a function that returns a `*cobra.Command`. All flags and
request payloads are variables scoped to the command's initialization
function. This makes it possible to construct independent copies of the
CLI structure and fixes the test isolation issue.

The scope of this change is limited to the generated commands; the other
commands will be updated accordingly in follow-up changes.
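
As a rough illustration of the new shape (the command group and flag
below are made up for this example, not actual generated code):

```go
package clusters

import "github.com/spf13/cobra"

// newListCmd constructs a fresh command on every call. The flag variable
// lives in this function's scope instead of a package-level global, so
// independent copies of the CLI tree do not interfere with each other.
func newListCmd() *cobra.Command {
	var output string

	cmd := &cobra.Command{
		Use:   "list",
		Short: "List clusters",
		RunE: func(cmd *cobra.Command, args []string) error {
			cmd.Printf("listing clusters as %s\n", output)
			return nil
		},
	}
	cmd.Flags().StringVar(&output, "output", "text", "output format")
	return cmd
}

// New returns the command group. Tests can build as many isolated
// instances of the command tree as they need and run them in parallel.
func New() *cobra.Command {
	cmd := &cobra.Command{Use: "clusters"}
	cmd.AddCommand(newListCmd())
	return cmd
}
```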

## Tests

Unit and integration tests pass.
2023-07-25 20:19:07 +02:00
Pieter Noordhuis 98ebb78c9b
Rename bricks -> databricks (#389)
## Changes

Rename all instances of "bricks" to "databricks".

## Tests

* Confirmed the goreleaser build works, uses the correct new binary
name, and produces the right archives.
* Help output is confirmed to be correct.
* Output of `git grep -w bricks` is minimal, with a couple of changes
remaining for after the repository rename.
2023-05-16 18:35:39 +02:00
Serge Smertin 4c4a293015
Added OpenAPI command coverage (#357)
This PR adds the following command groups:

## Workspace-level command groups

 * `bricks alerts` - The alerts API can be used to perform CRUD operations on alerts.
 * `bricks catalogs` - A catalog is the first layer of Unity Catalog’s three-level namespace.
 * `bricks cluster-policies` - Cluster policy limits the ability to configure clusters based on a set of rules.
 * `bricks clusters` - The Clusters API allows you to create, start, edit, list, terminate, and delete clusters.
 * `bricks current-user` - This API allows retrieving information about the currently authenticated user or service principal.
 * `bricks dashboards` - In general, there is little need to modify dashboards using the API.
 * `bricks data-sources` - This API is provided to assist you in making new query objects.
 * `bricks experiments` - MLflow Experiment tracking.
 * `bricks external-locations` - An external location is an object that combines a cloud storage path with a storage credential that authorizes access to the cloud storage path.
 * `bricks functions` - Functions implement User-Defined Functions (UDFs) in Unity Catalog.
 * `bricks git-credentials` - Registers a personal access token for Databricks to perform operations on behalf of the user.
 * `bricks global-init-scripts` - The Global Init Scripts API enables Workspace administrators to configure global initialization scripts for their workspace.
 * `bricks grants` - In Unity Catalog, data is secure by default.
 * `bricks groups` - Groups simplify identity management, making it easier to assign access to Databricks Workspace, data, and other securable objects.
 * `bricks instance-pools` - The Instance Pools API is used to create, edit, delete, and list instance pools, which use ready-to-use cloud instances to reduce cluster start and auto-scaling times.
 * `bricks instance-profiles` - The Instance Profiles API allows admins to add, list, and remove instance profiles that users can launch clusters with.
 * `bricks ip-access-lists` - IP Access List enables admins to configure IP access lists.
 * `bricks jobs` - The Jobs API allows you to create, edit, and delete jobs.
 * `bricks libraries` - The Libraries API allows you to install and uninstall libraries and get the status of libraries on a cluster.
 * `bricks metastores` - A metastore is the top-level container of objects in Unity Catalog.
 * `bricks model-registry` - MLflow Model Registry commands.
 * `bricks permissions` - The Permissions API is used to create read, write, edit, update, and manage access for various users on different objects and endpoints.
 * `bricks pipelines` - The Delta Live Tables API allows you to create, edit, delete, start, and view details about pipelines.
 * `bricks policy-families` - View available policy families.
 * `bricks providers` - Databricks Providers REST API.
 * `bricks queries` - These endpoints are used for CRUD operations on query definitions.
 * `bricks query-history` - Access the history of queries through SQL warehouses.
 * `bricks recipient-activation` - Databricks Recipient Activation REST API.
 * `bricks recipients` - Databricks Recipients REST API.
 * `bricks repos` - The Repos API allows users to manage their git repos.
 * `bricks schemas` - A schema (also called a database) is the second layer of Unity Catalog’s three-level namespace.
 * `bricks secrets` - The Secrets API allows you to manage secrets, secret scopes, and access permissions.
 * `bricks service-principals` - Identities for use with jobs, automated tools, and systems such as scripts, apps, and CI/CD platforms.
 * `bricks serving-endpoints` - The Serving Endpoints API allows you to create, update, and delete model serving endpoints.
 * `bricks shares` - Databricks Shares REST API.
 * `bricks storage-credentials` - A storage credential represents an authentication and authorization mechanism for accessing data stored on your cloud tenant.
 * `bricks table-constraints` - Primary key and foreign key constraints encode relationships between fields in tables.
 * `bricks tables` - A table resides in the third layer of Unity Catalog’s three-level namespace.
 * `bricks token-management` - Enables administrators to get all tokens and delete tokens for other users.
 * `bricks tokens` - The Token API allows you to create, list, and revoke tokens that can be used to authenticate and access Databricks REST APIs.
 * `bricks users` - User identities recognized by Databricks and represented by email addresses.
 * `bricks volumes` - Volumes are a Unity Catalog (UC) capability for accessing, storing, governing, organizing and processing files.
 * `bricks warehouses` - A SQL warehouse is a compute resource that lets you run SQL commands on data objects within Databricks SQL.
 * `bricks workspace` - The Workspace API allows you to list, import, export, and delete notebooks and folders.
 * `bricks workspace-conf` - This API allows updating known workspace settings for advanced users.

## Account-level command groups

 * `bricks account billable-usage` - This API allows you to download billable usage logs for the specified account and date range.
 * `bricks account budgets` - These APIs manage budget configuration including notifications for exceeding a budget for a period.
 * `bricks account credentials` - These APIs manage credential configurations for this workspace.
 * `bricks account custom-app-integration` - These APIs enable administrators to manage custom OAuth app integrations, which is required for adding/using a Custom OAuth App Integration such as Tableau Cloud for Databricks in AWS cloud.
 * `bricks account encryption-keys` - These APIs manage encryption key configurations for this workspace (optional).
 * `bricks account groups` - Groups simplify identity management, making it easier to assign access to Databricks Account, data, and other securable objects.
 * `bricks account ip-access-lists` - The Accounts IP Access List API enables account admins to configure IP access lists for access to the account console.
 * `bricks account log-delivery` - These APIs manage log delivery configurations for this account.
 * `bricks account metastore-assignments` - These APIs manage metastore assignments to a workspace.
 * `bricks account metastores` - These APIs manage Unity Catalog metastores for an account.
 * `bricks account networks` - These APIs manage network configurations for customer-managed VPCs (optional).
 * `bricks account o-auth-enrollment` - These APIs enable administrators to enroll OAuth for their accounts, which is required for adding/using any OAuth published/custom application integration.
 * `bricks account private-access` - These APIs manage private access settings for this account.
 * `bricks account published-app-integration` - These APIs enable administrators to manage published OAuth app integrations, which is required for adding/using a Published OAuth App Integration such as Tableau Cloud for Databricks in AWS cloud.
 * `bricks account service-principals` - Identities for use with jobs, automated tools, and systems such as scripts, apps, and CI/CD platforms.
 * `bricks account storage` - These APIs manage storage configurations for this workspace.
 * `bricks account storage-credentials` - These APIs manage storage credentials for a particular metastore.
 * `bricks account users` - User identities recognized by Databricks and represented by email addresses.
 * `bricks account vpc-endpoints` - These APIs manage VPC endpoint configurations for this account.
 * `bricks account workspace-assignment` - The Workspace Permission Assignment API allows you to manage workspace permissions for principals in your account.
 * `bricks account workspaces` - These APIs manage workspaces for this account.
2023-04-26 13:06:16 +02:00
Pieter Noordhuis 66c6eef3e5
Reinstate configure command (#361)
It was removed by accident in #321.
2023-04-26 08:10:35 +02:00
Pieter Noordhuis 81b69094aa
Remove stale command (#352) 2023-04-21 13:39:54 +02:00
shreyas-goenka 375eb1c502
Remove package project (#321)
## Changes

This PR removes the project package and its dependents in the bricks
repo.

## Tests
2023-04-11 16:59:27 +02:00
Pieter Noordhuis 46cfa747ac
Move and hide launch and test commands (#222)
Semantics in the context of a bundle are not yet clearly defined. Moving
and hiding these commands until then.
2023-03-09 10:26:56 +01:00
Pieter Noordhuis 1c27f081e0
Include build information and add version command (#194)
Includes the relevant fields listed at
https://goreleaser.com/customization/templates/ in build artifacts.

The version command outputs the version by default:
```
$ bricks version
0.0.21-devel
```

Or all build information if `--json` is specified:
```
$ bricks version --json
{
  "ProjectName": "bricks",
  "Version": "0.0.21-devel",
  "Branch": "version-info",
  "Tag": "v0.0.20",
  "ShortCommit": "193b56b",
  "FullCommit": "193b56b0929128c0836d35e913c46fd66fa2a93c",
  "CommitTime": "2023-02-02T22:04:42+01:00",
  "Summary": "v0.0.20-5-g193b56b",
  "Major": 0,
  "Minor": 0,
  "Patch": 20,
  "Prerelease": "",
  "IsSnapshot": true,
  "BuildTime": "2023-02-02T22:07:36+01:00"
}
```
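
Build metadata like this is typically stamped in at link time from the
goreleaser template fields; a hedged sketch of the mechanism (module
path and identifiers are illustrative, not the CLI's actual build
package):

```go
// Package build holds values injected at build time, e.g. via goreleaser:
//
//   ldflags:
//     - -X example.com/bricks/internal/build.version={{.Version}}
//     - -X example.com/bricks/internal/build.fullCommit={{.FullCommit}}
//     - -X example.com/bricks/internal/build.buildTime={{.Date}}
package build

var (
	version    = "0.0.0-devel" // default when not built by goreleaser
	fullCommit = ""
	buildTime  = ""
)

// Info is the kind of structure a `version --json` command could marshal.
type Info struct {
	Version    string
	FullCommit string
	BuildTime  string
}

// GetInfo returns the build information stamped into the binary.
func GetInfo() Info {
	return Info{Version: version, FullCommit: fullCommit, BuildTime: buildTime}
}
```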
2023-02-03 15:38:53 +01:00
Serge Smertin b87b4b0f40
Added `bricks auth login` and `bricks auth token` (#158)
# Auth challenge (happy path)

Simplified description of [PKCE](https://oauth.net/2/pkce/)
implementation:

```mermaid
sequenceDiagram
    autonumber
    actor User
    
    User ->> CLI: type `bricks auth login HOST`
    CLI ->>+ HOST: request OIDC endpoints
    HOST ->>- CLI: auth & token endpoints
    CLI ->> CLI: start embedded server to consume redirects (lock)
    CLI -->>+ Auth Endpoint: open browser with RND1 + SHA256(RND2)

    User ->>+ Auth Endpoint: Go through SSO
    Auth Endpoint ->>- CLI: AUTH CODE + RND1 (redirect)

    CLI ->>+ Token Endpoint: Exchange: AUTH CODE + RND2
    Token Endpoint ->>- CLI: Access Token (JWT) + refresh + expiry
    CLI ->> Token cache: Save Access Token (JWT) + refresh + expiry
    CLI ->> User: success
```
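
For reference, the RND2 / SHA256(RND2) pair in the diagram corresponds to
the PKCE code verifier and challenge; a minimal sketch per RFC 7636, not
the CLI's actual implementation:

```go
package auth

import (
	"crypto/rand"
	"crypto/sha256"
	"encoding/base64"
)

// newVerifierAndChallenge returns a random PKCE code verifier (RND2 in the
// diagram) and its S256 challenge, which is sent to the auth endpoint.
func newVerifierAndChallenge() (verifier, challenge string, err error) {
	buf := make([]byte, 32)
	if _, err = rand.Read(buf); err != nil {
		return "", "", err
	}
	// RFC 7636 uses unpadded base64url encoding for both values.
	verifier = base64.RawURLEncoding.EncodeToString(buf)
	sum := sha256.Sum256([]byte(verifier))
	challenge = base64.RawURLEncoding.EncodeToString(sum[:])
	return verifier, challenge, nil
}
```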

# Token refresh (happy path)

```mermaid
sequenceDiagram
    autonumber
    actor User
    
    User ->> CLI: type `bricks token HOST`
    
    CLI ->> CLI: acquire lock (same local addr as redirect server)
    CLI ->>+ Token cache: read token

    critical token not expired
    Token cache ->>- User: JWT (without refresh)

    option token is expired
    CLI ->>+ HOST: request OIDC endpoints
    HOST ->>- CLI: auth & token endpoints
    CLI ->>+ Token Endpoint: refresh token
    Token Endpoint ->>- CLI: JWT (refreshed)
    CLI ->> Token cache: save JWT (refreshed)
    CLI ->> User: JWT (refreshed)
    
    option no auth for host
    CLI -x User: no auth configured
    end
```
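
The refresh branch boils down to an expiry check plus a refresh-token
exchange; a sketch using `golang.org/x/oauth2` (the cache interface and
function names are illustrative, not the CLI's actual code):

```go
package auth

import (
	"context"

	"golang.org/x/oauth2"
)

// tokenCache is an illustrative cache; the real CLI token cache differs.
type tokenCache interface {
	Lookup(host string) (*oauth2.Token, error)
	Store(host string, t *oauth2.Token) error
}

// loadToken mirrors the branches in the diagram: serve the cached JWT while
// it is still valid, otherwise refresh it via the token endpoint and persist
// the result. A missing cache entry maps to the "no auth configured" branch.
func loadToken(ctx context.Context, host string, cache tokenCache, cfg *oauth2.Config) (*oauth2.Token, error) {
	cached, err := cache.Lookup(host)
	if err != nil {
		return nil, err
	}
	if cached.Valid() {
		return cached, nil
	}
	refreshed, err := cfg.TokenSource(ctx, cached).Token()
	if err != nil {
		return nil, err
	}
	if err := cache.Store(host, refreshed); err != nil {
		return nil, err
	}
	return refreshed, nil
}
```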
2023-01-06 16:15:57 +01:00
Pieter Noordhuis fdb8c97f6b
Exit with non-zero status on errors (#148) 2022-12-21 11:58:51 +01:00
shreyas-goenka d9d295f2a9
Implement Terraform state synchronization and deploy (#98)
https://user-images.githubusercontent.com/88374338/203669797-abebf99e-8fa6-4d6e-b57a-abd172d8020d.mov
2022-12-06 00:40:45 +01:00
Pieter Noordhuis 07f07694a4
Function to return workspace client on bundle.Bundle (#100)
Complementary command to check the identity in the context of a bundle
environment:

For example:
```
bricks bundle debug whoami -e development
```
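
A hedged sketch of the idea using today's Go SDK (struct fields and
wiring are illustrative, not the actual `bundle.Bundle` definition):

```go
package bundle

import (
	"sync"

	"github.com/databricks/databricks-sdk-go"
)

// Bundle is trimmed down for illustration; the real struct carries more.
type Bundle struct {
	Host    string
	Profile string

	clientOnce sync.Once
	client     *databricks.WorkspaceClient
	clientErr  error
}

// WorkspaceClient lazily constructs and memoizes a workspace client from the
// bundle's settings, so a command like `bundle debug whoami` can resolve the
// identity for the selected environment.
func (b *Bundle) WorkspaceClient() (*databricks.WorkspaceClient, error) {
	b.clientOnce.Do(func() {
		b.client, b.clientErr = databricks.NewWorkspaceClient(&databricks.Config{
			Host:    b.Host,
			Profile: b.Profile,
		})
	})
	return b.client, b.clientErr
}
```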
2022-11-23 15:20:03 +01:00
Pieter Noordhuis 3b351d3b00
Add command that writes the materialized bundle configuration to stdout (#95)
Used to inspect the bundle configuration after loading and merging all
files.

Once we add variable interpolation, this command could show the result
after interpolation as well.

Each of the mutations to this configuration is observable, so we could
add a mode that writes each of the intermediate versions to disk for
even more fine-grained introspection.
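
A minimal sketch of such a command (the config type and loader are
placeholders, not the actual bundle loading pipeline):

```go
package debug

import (
	"encoding/json"

	"github.com/spf13/cobra"
)

// Config stands in for the merged bundle configuration tree.
type Config map[string]any

// Loader is a placeholder for the real load-and-merge pipeline.
type Loader func() (Config, error)

// NewConfCmd prints the materialized configuration so it can be inspected
// after loading and merging all files.
func NewConfCmd(load Loader) *cobra.Command {
	return &cobra.Command{
		Use:   "conf",
		Short: "Print the materialized bundle configuration",
		RunE: func(cmd *cobra.Command, args []string) error {
			cfg, err := load()
			if err != nil {
				return err
			}
			buf, err := json.MarshalIndent(cfg, "", "  ")
			if err != nil {
				return err
			}
			cmd.Println(string(buf))
			return nil
		},
	}
}
```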
2022-11-21 15:39:53 +01:00
Shreyas Goenka 0c24e6f82e
Revert "WIP initial version of the workspace file lock done"
This reverts commit 02eec1f990.
2022-11-16 23:50:17 +01:00
Shreyas Goenka 02eec1f990
WIP initial version of the workspace file lock done 2022-11-16 17:30:46 +01:00
Pieter Noordhuis 38a9dabcbe
Add command to make API calls (#80)
It is not yet settled whether this should live as a top-level command or
be hidden under some debug scope. Either way, the ability to make
arbitrary API calls and leverage unified auth is a super useful tool.
2022-10-10 10:27:45 +02:00
Kartik Gupta 457f3ad3c2
Add `bricks configure` command to bricks CLI (#18)
* bricks configure

* remove t.setenv

* Read token and host from stdin

* Update .vscode/testing.code-snippets

Co-authored-by: Serge Smertin <259697+nfx@users.noreply.github.com>
2022-09-05 20:25:54 +02:00
Serge Smertin 32ae59c1bc Experimental sync command 2022-07-07 20:56:59 +02:00
Serge Smertin 3d3b722eda updated dependencies 2022-05-14 19:56:09 +02:00
Serge Smertin 15fd93a012 Initial commit 2022-05-13 15:30:22 +02:00