Commit Graph

21 Commits

Author SHA1 Message Date
shreyas-goenka 6430d23453
Print y/n options when displaying prompts using cmdio.Ask (#650)
## Changes
Adds a `[y/n]` hint to `cmdio.Ask` so the available options are obvious
in all question prompts.
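
For illustration, a minimal Go sketch of the idea (the `ask` helper and its details are made up here; the real `cmdio.Ask` differs):

```go
package main

import (
	"bufio"
	"fmt"
	"os"
	"strings"
)

// ask prints the question with an explicit [y/n] hint and reads one line.
func ask(question string) (bool, error) {
	// Print the available options so the user knows what to type.
	fmt.Fprintf(os.Stderr, "%s [y/n]: ", question)
	line, err := bufio.NewReader(os.Stdin).ReadString('\n')
	if err != nil {
		return false, err
	}
	return strings.TrimSpace(strings.ToLower(line)) == "y", nil
}

func main() {
	yes, err := ask("Proceed with deletion?")
	if err != nil {
		panic(err)
	}
	fmt.Println("confirmed:", yes)
}
```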

## Tests
Tested manually. Works.
2023-08-09 09:22:42 +00:00
Pieter Noordhuis db6313e99c
Fix secrets put-secret command (#545)
## Changes

Two issues with this command:
* The command line arguments for the secret value were ignored
* If the secret value was piped through stdin, it would still prompt

The second issue prevented users from using multi-line strings because
the prompt reads until end-of-line.
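
A rough Go sketch of the stdin handling this implies (the `readSecret` helper is hypothetical, not the CLI's actual code): detect whether stdin is a terminal, and read it in full when it is piped so multi-line values survive:

```go
package main

import (
	"bufio"
	"fmt"
	"io"
	"os"
)

func readSecret() (string, error) {
	stat, err := os.Stdin.Stat()
	if err != nil {
		return "", err
	}
	// Not a character device: stdin is a pipe or a file. Read it in
	// full; multi-line values survive because we do not stop at '\n'.
	if stat.Mode()&os.ModeCharDevice == 0 {
		b, err := io.ReadAll(os.Stdin)
		return string(b), err
	}
	// Interactive terminal: prompt and read a single line.
	fmt.Fprint(os.Stderr, "Secret value: ")
	return bufio.NewReader(os.Stdin).ReadString('\n')
}

func main() {
	v, err := readSecret()
	if err != nil {
		panic(err)
	}
	fmt.Printf("read %d bytes\n", len(v))
}
```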

This change adds testing infrastructure for:
* Setting up a workspace focused test (common between many tests)
* Running a snippet of Python through the command execution API

Porting more integration tests to use this infrastructure will be done
in later commits.

## Tests

New integration test passes.

The interactive path cannot be integration tested just yet.
2023-07-05 17:30:54 +02:00
Pieter Noordhuis f64c44285f
Decode contents by default in workspace export command (#531)
## Changes

Also see #525.

The direct download flag has been removed in newer versions because of
the content type issue.

Instead, we can make the command decode the base64 output when the
output mode is text.

```
$ databricks workspace export /some/path/script.sh
#!/bin/bash
echo "this is a script"
```
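
For reference, a standalone Go sketch of the decoding step (names and structure are illustrative, not the command's actual code):

```go
package main

import (
	"encoding/base64"
	"fmt"
)

func main() {
	// Stand-in for the base64 content the workspace export API returns.
	exported := base64.StdEncoding.EncodeToString(
		[]byte("#!/bin/bash\necho \"this is a script\"\n"))

	// Text output mode: decode before printing.
	decoded, err := base64.StdEncoding.DecodeString(exported)
	if err != nil {
		panic(err)
	}
	fmt.Print(string(decoded))
}
```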

## Tests

New integration test.
2023-06-27 20:42:29 +02:00
shreyas-goenka f2a2d058d1
Remove \r from new line print statements (#509)
## Changes
Removes the carriage return character from newline prints in JSON output
mode and sync events.

## Tests
Manually
2023-06-22 13:47:52 +02:00
Pieter Noordhuis b9406efd27
Update configure command (#482)
## Changes

This now uses:
* libs/cmdio to determine interactivity and perform prompting
* libs/databrickscfg to persist the profile

It loads a config.Config structure from the environment just like we do
for unified authentication. It is therefore possible to specify both the
host and token with environment variables.
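
A simplified Go sketch of that flow, with made-up helpers (`config`, `loadFromEnv`, `validate`); the error strings mirror the transcript below:

```go
package main

import (
	"errors"
	"fmt"
	"os"
	"strings"
)

type config struct {
	Host  string
	Token string
}

// loadFromEnv mirrors the idea of loading the configuration from the
// environment, the same way unified authentication does.
func loadFromEnv() config {
	return config{
		Host:  os.Getenv("DATABRICKS_HOST"),
		Token: os.Getenv("DATABRICKS_TOKEN"),
	}
}

func validate(c config, interactive bool) error {
	if c.Host == "" && !interactive {
		return errors.New("host must be set in non-interactive mode")
	}
	if c.Host != "" && !strings.HasPrefix(c.Host, "https://") {
		return errors.New("must start with https://")
	}
	return nil
}

func main() {
	cfg := loadFromEnv()
	if err := validate(cfg, false); err != nil {
		fmt.Fprintln(os.Stderr, "Error:", err)
		os.Exit(1)
	}
}
```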

## Tests

```
pieter.noordhuis@L4GHXDT29P /tmp % export DATABRICKS_CONFIG_FILE=.databrickscfg
pieter.noordhuis@L4GHXDT29P /tmp % databricks configure
Databricks Host: https://foo.bar
Personal Access Token: *****
pieter.noordhuis@L4GHXDT29P /tmp % cat .databrickscfg
[DEFAULT]
host  = https://foo.bar
token = token
pieter.noordhuis@L4GHXDT29P /tmp % echo token | databricks configure
Error: host must be set in non-interactive mode
pieter.noordhuis@L4GHXDT29P /tmp % echo token | databricks configure --host foo
Error: must start with https://
pieter.noordhuis@L4GHXDT29P /tmp % echo token | databricks configure --host https://foo
pieter.noordhuis@L4GHXDT29P /tmp % cat .databrickscfg
[DEFAULT]
host  = https://foo
token = token
pieter.noordhuis@L4GHXDT29P /tmp % cat .databrickscfg
pieter.noordhuis@L4GHXDT29P /tmp % databricks configure --host https://foo
Personal Access Token: ******
pieter.noordhuis@L4GHXDT29P /tmp % cat .databrickscfg
[DEFAULT]
host  = https://foo
token = token2
```
2023-06-15 12:50:19 +00:00
Pieter Noordhuis e4415bfbcf
Tweak profile prompt (#454)
## Changes

This includes the following changes:
* Move profile loading code to libs/databrickscfg and add tests
* Update prompt label to reflect workspace/account profiles
* Start prompt in search mode by default
* Custom error if `~/.databrickscfg` doesn't exist
* Custom error if `~/.databrickscfg` doesn't contain profiles
* Use stderr for the prompt so that stdout redirection works (e.g. with `jq` or `jless`); see the sketch below
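
A minimal Go sketch of the stderr point (the `selectProfile` helper is hypothetical): interactive text goes to stderr so stdout stays clean for pipes:

```go
package main

import (
	"bufio"
	"fmt"
	"os"
	"strings"
)

func selectProfile(profiles []string) (string, error) {
	// Prompt on stderr so `databricks ... | jq` only receives real output.
	fmt.Fprintf(os.Stderr, "Select a profile %v: ", profiles)
	line, err := bufio.NewReader(os.Stdin).ReadString('\n')
	if err != nil {
		return "", err
	}
	return strings.TrimSpace(line), nil
}

func main() {
	profile, err := selectProfile([]string{"DEFAULT", "staging"})
	if err != nil {
		panic(err)
	}
	// Only the selected value reaches stdout.
	fmt.Println(profile)
}
```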

## Tests

* New unit tests pass
* Manual tests for both workspace and account commands
* Search-by-default is really nice if you have many profiles
2023-06-09 13:56:35 +02:00
shreyas-goenka 4818541062
Add workspace export-dir command (#449)
## Changes
This PR:
1. Adds the export-dir command
2. Changes filer.Read to return an error if a user tries to read a
directory
3. Makes filer.Stat().Sys() return internal file structures (see the sketch below)
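
A conceptual Go sketch of points 2 and 3 (all types here are made-up stand-ins for filer's internals): reads refuse directories, and `Stat()` exposes backend metadata through `fs.FileInfo`'s `Sys()` escape hatch:

```go
package main

import (
	"errors"
	"fmt"
	"io/fs"
	"time"
)

// objectInfo stands in for a backend's internal file structure.
type objectInfo struct {
	Path  string
	IsDir bool
}

// fileInfo adapts objectInfo to fs.FileInfo.
type fileInfo struct{ obj objectInfo }

func (f fileInfo) Name() string { return f.obj.Path }
func (f fileInfo) Size() int64  { return 0 }
func (f fileInfo) Mode() fs.FileMode {
	if f.obj.IsDir {
		return fs.ModeDir
	}
	return 0
}
func (f fileInfo) ModTime() time.Time { return time.Time{} }
func (f fileInfo) IsDir() bool        { return f.obj.IsDir }

// Sys exposes the internal structure, as filer.Stat().Sys() does.
func (f fileInfo) Sys() any { return f.obj }

// read refuses directories, matching the filer.Read change.
func read(info fs.FileInfo) error {
	if info.IsDir() {
		return errors.New("cannot read directory")
	}
	return nil
}

func main() {
	info := fileInfo{obj: objectInfo{Path: "/dir", IsDir: true}}
	fmt.Println(read(info))         // cannot read directory
	fmt.Printf("%#v\n", info.Sys()) // backend metadata via Sys()
}
```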

## Tests
Integration tests and manually
2023-06-08 18:15:12 +02:00
shreyas-goenka 53164ae880
Add new line to cmdio JSON rendering (#443)
## Changes
This PR adds a trailing newline to JSON rendering using cmdio. This is
useful when `cmdio.Render` is called multiple times.
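
A tiny Go sketch of the fix (the `renderJSON` helper is illustrative, not cmdio's code):

```go
package main

import (
	"encoding/json"
	"fmt"
	"os"
)

func renderJSON(v any) error {
	b, err := json.MarshalIndent(v, "", "  ")
	if err != nil {
		return err
	}
	// The trailing newline keeps back-to-back renders from running together.
	_, err = fmt.Fprintf(os.Stdout, "%s\n", b)
	return err
}

func main() {
	_ = renderJSON(map[string]string{"event": "first"})
	_ = renderJSON(map[string]string{"event": "second"})
}
```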

## Tests
Manually

Co-authored-by: Pieter Noordhuis <pieter.noordhuis@databricks.com>
2023-06-08 15:48:51 +02:00
shreyas-goenka ae10419eb8
Add fs cat command for dbfs files (#430)
## Changes
TSIA

## Tests
Manually and integration tests
2023-06-06 01:16:23 +02:00
shreyas-goenka 6ff00122ad
Add fs ls command for dbfs (#429)
## Changes
1. Adds fs ls command
2. Adds ability to define multiple templates

## Tests
Manually and integration tests
2023-06-05 17:41:30 +02:00
Andrew Nester 1f130f3722
Do not use FgWhite and FgBlack for terminal output (#435)
## Changes
Using white or black for terminal output leads to poorly displayed
content on either light or dark terminal backgrounds. Other CLIs have
run into the same issue
(https://github.com/qri-io/qri/pull/774)

Instead, let's just use color to highlight some of the output so it's
more compatible with different background styles
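
As a sketch of the approach, using the common `github.com/fatih/color` package (whether the CLI uses this exact package is an assumption here):

```go
package main

import "github.com/fatih/color"

func main() {
	// Avoid color.FgWhite / color.FgBlack: they vanish against matching
	// backgrounds. Accent colors read well on light and dark themes alike.
	color.New(color.FgGreen).Println("deployed successfully")
	color.New(color.FgRed, color.Bold).Println("Error: something failed")
	color.New(color.Bold).Println("section header") // bold, default foreground
}
```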

## Tests
<img width="772" alt="Screenshot 2023-06-05 at 16 05 09"
src="https://github.com/databricks/cli/assets/2969996/01790239-6a33-4059-86a8-d5117ea0b75f">

---

<img width="757" alt="Screenshot 2023-06-05 at 16 05 20"
src="https://github.com/databricks/cli/assets/2969996/ea3b9fdc-3782-4f4f-a9df-19e66af0c04f">
2023-06-05 17:30:40 +02:00
Serge Smertin 24ebfdf31e
Add readable console logger (#370)
## Changes

Add a readable colored console logger that is active only for TTYs:

<img width="764" alt="image"
src="https://user-images.githubusercontent.com/259697/235221427-ca482b32-9f88-4adb-ada3-8c4f35f50f06.png">

## Tests

Run `go run main.go clusters list --log-level debug --profile demo`
2023-06-01 11:37:33 +02:00
Andrew Nester 05eaf7ff50
Added secrets input prompt for secrets put-secret command (#413)
## Changes
Added a secrets input prompt for the secrets put-secret command

## Tests

<img width="623" alt="Screenshot 2023-05-30 at 12 06 24"
src="https://github.com/databricks/cli/assets/2969996/9338e6ba-c504-48cc-ac97-cac97dde7a3a">
2023-05-31 10:18:29 +02:00
Pieter Noordhuis 98ebb78c9b
Rename bricks -> databricks (#389)
## Changes

Rename all instances of "bricks" to "databricks".

## Tests

* Confirmed the goreleaser build works, uses the correct new binary
name, and produces the right archives.
* Help output is confirmed to be correct.
* Output of `git grep -w bricks` is minimal with a couple changes
remaining for after the repository rename.
2023-05-16 18:35:39 +02:00
Serge Smertin 4c4a293015
Added OpenAPI command coverage (#357)
This PR adds the following command groups:

## Workspace-level command groups

 * `bricks alerts` - The alerts API can be used to perform CRUD operations on alerts.
 * `bricks catalogs` - A catalog is the first layer of Unity Catalog’s three-level namespace.
 * `bricks cluster-policies` - Cluster policy limits the ability to configure clusters based on a set of rules.
 * `bricks clusters` - The Clusters API allows you to create, start, edit, list, terminate, and delete clusters.
 * `bricks current-user` - This API allows retrieving information about the currently authenticated user or service principal.
 * `bricks dashboards` - In general, there is little need to modify dashboards using the API.
 * `bricks data-sources` - This API is provided to assist you in making new query objects.
 * `bricks experiments` - MLflow Experiment tracking.
 * `bricks external-locations` - An external location is an object that combines a cloud storage path with a storage credential that authorizes access to the cloud storage path.
 * `bricks functions` - Functions implement User-Defined Functions (UDFs) in Unity Catalog.
 * `bricks git-credentials` - Registers personal access token for Databricks to do operations on behalf of the user.
 * `bricks global-init-scripts` - The Global Init Scripts API enables Workspace administrators to configure global initialization scripts for their workspace.
 * `bricks grants` - In Unity Catalog, data is secure by default.
 * `bricks groups` - Groups simplify identity management, making it easier to assign access to Databricks Workspace, data, and other securable objects.
 * `bricks instance-pools` - The Instance Pools API is used to create, edit, delete, and list instance pools; pools of ready-to-use cloud instances reduce cluster start and auto-scaling times.
 * `bricks instance-profiles` - The Instance Profiles API allows admins to add, list, and remove instance profiles that users can launch clusters with.
 * `bricks ip-access-lists` - IP Access List enables admins to configure IP access lists.
 * `bricks jobs` - The Jobs API allows you to create, edit, and delete jobs.
 * `bricks libraries` - The Libraries API allows you to install and uninstall libraries and get the status of libraries on a cluster.
 * `bricks metastores` - A metastore is the top-level container of objects in Unity Catalog.
 * `bricks model-registry` - MLflow Model Registry commands.
 * `bricks permissions` - The Permissions API is used to create, read, write, edit, update, and manage access for various users on different objects and endpoints.
 * `bricks pipelines` - The Delta Live Tables API allows you to create, edit, delete, start, and view details about pipelines.
 * `bricks policy-families` - View available policy families.
 * `bricks providers` - Databricks Providers REST API.
 * `bricks queries` - These endpoints are used for CRUD operations on query definitions.
 * `bricks query-history` - Access the history of queries through SQL warehouses.
 * `bricks recipient-activation` - Databricks Recipient Activation REST API.
 * `bricks recipients` - Databricks Recipients REST API.
 * `bricks repos` - The Repos API allows users to manage their git repos.
 * `bricks schemas` - A schema (also called a database) is the second layer of Unity Catalog’s three-level namespace.
 * `bricks secrets` - The Secrets API allows you to manage secrets, secret scopes, and access permissions.
 * `bricks service-principals` - Identities for use with jobs, automated tools, and systems such as scripts, apps, and CI/CD platforms.
 * `bricks serving-endpoints` - The Serving Endpoints API allows you to create, update, and delete model serving endpoints.
 * `bricks shares` - Databricks Shares REST API.
 * `bricks storage-credentials` - A storage credential represents an authentication and authorization mechanism for accessing data stored on your cloud tenant.
 * `bricks table-constraints` - Primary key and foreign key constraints encode relationships between fields in tables.
 * `bricks tables` - A table resides in the third layer of Unity Catalog’s three-level namespace.
 * `bricks token-management` - Enables administrators to get all tokens and delete tokens for other users.
 * `bricks tokens` - The Token API allows you to create, list, and revoke tokens that can be used to authenticate and access Databricks REST APIs.
 * `bricks users` - User identities recognized by Databricks and represented by email addresses.
 * `bricks volumes` - Volumes are a Unity Catalog (UC) capability for accessing, storing, governing, organizing and processing files.
 * `bricks warehouses` - A SQL warehouse is a compute resource that lets you run SQL commands on data objects within Databricks SQL.
 * `bricks workspace` - The Workspace API allows you to list, import, export, and delete notebooks and folders.
 * `bricks workspace-conf` - This API allows updating known workspace settings for advanced users.

## Account-level command groups

 * `bricks account billable-usage` - This API allows you to download billable usage logs for the specified account and date range.
 * `bricks account budgets` - These APIs manage budget configuration including notifications for exceeding a budget for a period.
 * `bricks account credentials` - These APIs manage credential configurations for this workspace.
 * `bricks account custom-app-integration` - These APIs enable administrators to manage custom oauth app integrations, which is required for adding/using Custom OAuth App Integration like Tableau Cloud for Databricks in AWS cloud.
 * `bricks account encryption-keys` - These APIs manage encryption key configurations for this workspace (optional).
 * `bricks account groups` - Groups simplify identity management, making it easier to assign access to Databricks Account, data, and other securable objects.
 * `bricks account ip-access-lists` - The Accounts IP Access List API enables account admins to configure IP access lists for access to the account console.
 * `bricks account log-delivery` - These APIs manage log delivery configurations for this account.
 * `bricks account metastore-assignments` - These APIs manage metastore assignments to a workspace.
 * `bricks account metastores` - These APIs manage Unity Catalog metastores for an account.
 * `bricks account networks` - These APIs manage network configurations for customer-managed VPCs (optional).
 * `bricks account o-auth-enrollment` - These APIs enable administrators to enroll OAuth for their accounts, which is required for adding/using any OAuth published/custom application integration.
 * `bricks account private-access` - These APIs manage private access settings for this account.
 * `bricks account published-app-integration` - These APIs enable administrators to manage published oauth app integrations, which is required for adding/using Published OAuth App Integration like Tableau Cloud for Databricks in AWS cloud.
 * `bricks account service-principals` - Identities for use with jobs, automated tools, and systems such as scripts, apps, and CI/CD platforms.
 * `bricks account storage` - These APIs manage storage configurations for this workspace.
 * `bricks account storage-credentials` - These APIs manage storage credentials for a particular metastore.
 * `bricks account users` - User identities recognized by Databricks and represented by email addresses.
 * `bricks account vpc-endpoints` - These APIs manage VPC endpoint configurations for this account.
 * `bricks account workspace-assignment` - The Workspace Permission Assignment API allows you to manage workspace permissions for principals in your account.
 * `bricks account workspaces` - These APIs manage workspaces for this account.
2023-04-26 13:06:16 +02:00
shreyas-goenka 43bc9a0d9d
Use cmdio logger to log bricks cmd execution errors (#348)
## Changes
Uses the cmdio logger to log the execution error

## Tests
Manually by making the root command return fake errors. Here is the
output:
```
shreyas.goenka@THW32HFW6T bricks % bricks bundle validate
Error: my foo error
```

```
shreyas.goenka@THW32HFW6T bricks % bricks bundle validate --progress-format=json
{
  "error": "my foo error"
}
```

---------

Co-authored-by: Pieter Noordhuis <pieter.noordhuis@databricks.com>
2023-04-24 12:11:52 +02:00
shreyas-goenka ddc0237468
Error out if question prompts are used in json mode (#340)
## Changes
This PR disallows questions in json mode
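
A Go sketch of the guard (the `progressFormat` variable and `ask` helper are hypothetical): fail fast instead of blocking on input nobody can answer:

```go
package main

import (
	"bufio"
	"errors"
	"fmt"
	"os"
	"strings"
)

var progressFormat = "json" // normally set by --progress-format

func ask(question string) (string, error) {
	// In json mode, a prompt would corrupt the output stream and hang
	// non-interactive callers, so refuse immediately.
	if progressFormat == "json" {
		return "", errors.New("question prompts are not supported in json mode")
	}
	fmt.Fprintf(os.Stderr, "%s [y/n]: ", question)
	line, err := bufio.NewReader(os.Stdin).ReadString('\n')
	if err != nil {
		return "", err
	}
	return strings.TrimSpace(line), nil
}

func main() {
	if _, err := ask("Proceed with destroy?"); err != nil {
		fmt.Fprintln(os.Stderr, "Error:", err)
		os.Exit(1)
	}
}
```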

## Tests
Manually and unit test
```
shreyas.goenka@THW32HFW6T job-output % bricks bundle destroy --progress-format=json
The following resources will be removed:
{
  "resource_type": "databricks_job",
  "action": "delete",
  "resource_name": "foo"
}
Error: question prompts are not supported in json mode
```
2023-04-18 17:13:49 +02:00
shreyas-goenka 598ad62688
Log mutator messages using progress logger (#312)
This PR uses the progress logger to log messages inside mutators
2023-04-18 16:55:06 +02:00
shreyas-goenka 85889dffb1
Move state to event for whether they support inplace progress logging (#339)
## Changes
Adds an `IsInplaceSupported()` function to the event interface. Any event
that uses the progress logger now has to declare whether it supports
in-place logging.
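
A Go sketch of the shape such an interface might take (`IsInplaceSupported` comes from the PR description; the surrounding types are made up):

```go
package main

import "fmt"

// Event is what the progress logger renders.
type Event interface {
	String() string
	IsInplaceSupported() bool
}

type uploadProgress struct{ percent int }

func (u uploadProgress) String() string           { return fmt.Sprintf("uploading... %d%%", u.percent) }
func (u uploadProgress) IsInplaceSupported() bool { return true } // safe to overwrite in place

type errorEvent struct{ msg string }

func (e errorEvent) String() string           { return "Error: " + e.msg }
func (e errorEvent) IsInplaceSupported() bool { return false } // must keep its own line

func render(e Event) {
	if e.IsInplaceSupported() {
		fmt.Print("\r\x1b[K" + e.String()) // rewrite the current line
		return
	}
	fmt.Println(e.String())
}

func main() {
	render(uploadProgress{percent: 42})
	fmt.Println()
	render(errorEvent{msg: "upload failed"})
}
```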

## Tests
Manually
2023-04-18 14:20:35 +02:00
shreyas-goenka b9c68b4bd5
Fix wrap around issues with inplace logging (#334)
## Changes
In-place logging handled long lines of text that wrap around the
terminal poorly. This PR fixes that by saving the cursor position before
rendering.
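
A Go sketch of the cursor-save idea using raw ANSI escape codes (the actual fix may differ in detail):

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	fmt.Print("\x1b[s") // ESC[s: save the cursor position once, up front
	for i := 0; i <= 100; i += 25 {
		// ESC[u restores the saved position; ESC[J clears to the end of
		// the screen, wiping any rows a long line wrapped onto.
		fmt.Printf("\x1b[u\x1b[J(%d%%) syncing a very long path that may wrap...", i)
		time.Sleep(200 * time.Millisecond)
	}
	fmt.Println()
}
```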

## Tests
Manually
2023-04-14 13:06:04 +02:00
shreyas-goenka 4871f7bc8a
Add bundle destroy command (#300)
Adds bundle destroy capability to bricks
2023-04-06 12:54:58 +02:00