Commit Graph

327 Commits

Author SHA1 Message Date
Andrew Nester 1b4a774609
Only set ComputeID value when `--compute-id` flag provided (#1229)
## Changes
Fixes an issue where `compute_id` defined in the bundle config was
correctly replaced by the `validate` command but not used by the
`deploy` command.

## Tests
Manually
2024-02-22 15:14:06 +00:00
Miles Yucht b65ce75c1f
Use Go SDK Iterators when listing resources with the CLI (#1202)
## Changes
Currently, when the CLI runs a list API call (like listing jobs), it uses
the `List*All` methods from the SDK, which list all resources in the
collection. This is very slow for large collections: if you need to list
all jobs from a workspace that has 10,000+ jobs, you'll be waiting for
at least 100 RPCs to complete before seeing any output.

To address this, the SDK recently added an iterator data structure that
allows traversing the collection without needing to list it completely
first. New pages are fetched lazily when the next requested item belongs
to the next page. By using the `List()` methods that return these
iterators, the CLI can proactively print out some of the response before
the complete collection has been fetched.
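
As a rough illustration (not part of this change), a caller might consume
one of these iterators lazily like so; the `HasNext`/`Next` loop follows
the Go SDK's `listing.Iterator` shape, and the exact request and field
names are assumptions:

```go
package main

import (
	"context"
	"fmt"

	"github.com/databricks/databricks-sdk-go"
	"github.com/databricks/databricks-sdk-go/service/jobs"
)

func main() {
	ctx := context.Background()
	w := databricks.Must(databricks.NewWorkspaceClient())

	// List() returns an iterator; pages are fetched lazily as Next() crosses
	// page boundaries, so the first jobs print long before the full listing
	// has been traversed.
	it := w.Jobs.List(ctx, jobs.ListJobsRequest{})
	for it.HasNext(ctx) {
		job, err := it.Next(ctx)
		if err != nil {
			panic(err)
		}
		fmt.Println(job.JobId, job.Settings.Name)
	}
}
```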

This involves a pretty major rewrite of the rendering logic in `cmdio`.
The idea there is to define custom rendering logic based on the type of
the provided resource. There are three renderer interfaces:

1. textRenderer: supports printing something in a textual format (i.e.
not JSON, and not templated).
2. jsonRenderer: supports printing something in a pretty-printed JSON
format.
3. templateRenderer: supports printing something using a text template.

There are also three renderer implementations:

1. readerRenderer: supports printing a reader. This only implements the
textRenderer interface.
2. iteratorRenderer: supports printing a `listing.Iterator` from the Go
SDK. This implements jsonRenderer and templateRenderer, buffering 20
resources at a time before writing them to the output.
3. defaultRenderer: supports printing arbitrary resources (the previous
implementation).

Callers use `cmdio.Render()` to render an individual resource or an
`io.Reader`, and `cmdio.RenderIterator()` to render an iterator. The
separate method is needed to safely match on the type of the iterator,
since Go does not allow runtime type matches on generic types with an
existential type parameter.
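
For a sense of the shape involved, here is a minimal sketch of the three
interfaces; only the interface names come from the description above, and
the method signatures are assumptions:

```go
package cmdio

import (
	"context"
	"io"
	"text/template"
)

// Hypothetical method signatures; only the interface names come from the
// description above.
type textRenderer interface {
	renderText(ctx context.Context, w io.Writer) error
}

type jsonRenderer interface {
	renderJson(ctx context.Context, w io.Writer) error
}

type templateRenderer interface {
	renderTemplate(ctx context.Context, t *template.Template, w io.Writer) error
}
```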

One other change that needs to happen is to split the templates used for
text representation of list resources into a header template and a row
template. The template is now executed multiple times for List API
calls, but the header should only be printed once. To support this, I
have added `headerTemplate` to `cmdIO`, and I have also changed
`RenderWithTemplate` to include a `headerTemplate` parameter everywhere.
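
A toy example of the split, with made-up templates and data (the actual
templates live with each command):

```go
package main

import (
	"os"
	"text/template"
)

type job struct {
	ID   int64
	Name string
}

func main() {
	// The header template is executed exactly once...
	header := template.Must(template.New("header").Parse("ID\tName\n"))
	// ...while the row template is executed once per resource as pages arrive.
	row := template.Must(template.New("row").Parse("{{.ID}}\t{{.Name}}\n"))

	_ = header.Execute(os.Stdout, nil)
	for _, j := range []job{{1, "nightly etl"}, {2, "ad hoc report"}} {
		_ = row.Execute(os.Stdout, j)
	}
}
```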

## Tests
- [x] Unit tests for text rendering logic
- [x] Unit test for reflection-based iterator construction.

---------

Co-authored-by: Andrew Nester <andrew.nester@databricks.com>
2024-02-21 14:16:36 +00:00
Andrew Nester 5309e0fc2a
Improved error message when no .databrickscfg (#1223)
## Changes
Fixes #1060
2024-02-21 14:15:26 +00:00
shreyas-goenka 5ba0aaa5c5
Add support for UC Volumes to the `databricks fs` commands (#1209)
## Changes
```
shreyas.goenka@THW32HFW6T cli % databricks fs -h
Commands to do file system operations on DBFS and UC Volumes.

Usage:
  databricks fs [command]

Available Commands:
  cat         Show file content.
  cp          Copy files and directories.
  ls          Lists files.
  mkdir       Make directories.
  rm          Remove files and directories.
```

This PR adds support for UC Volumes to the fs commands. The fs commands
for UC Volumes work the same as they currently do for DBFS. This is
ensured by running the same test matrix across both the DBFS and UC
Volumes versions of the fs commands.

## Tests
Support for UC Volumes is tested by running the same tests as we did
originally for the DBFS commands. The tests require a `main` catalog to
exist in the workspace, which it does in our test workspace environments
that have the `TEST_METASTORE_ID` environment variable set.

For the Files API filer, we do the same by running mostly common tests
to ensure the filers for "local", "wsfs", "dbfs" and "files API" are
consistent.

All tests run in parallel to reduce the time taken. To keep the tests
isolated, each test creates its own UC schema (for UC Volumes tests) or
DBFS directories (for DBFS tests).
2024-02-20 16:14:37 +00:00
dependabot[bot] d9f34e6b22
Bump github.com/databricks/databricks-sdk-go from 0.32.0 to 0.33.0 (#1222)
Bumps
[github.com/databricks/databricks-sdk-go](https://github.com/databricks/databricks-sdk-go)
from 0.32.0 to 0.33.0.
Release notes for v0.33.0 (sourced from
github.com/databricks/databricks-sdk-go's releases):

Internal Changes:

* Add helper function to get header fields ([#822](https://redirect.github.com/databricks/databricks-sdk-go/pull/822)).
* Add Int64 to header type injection ([#819](https://redirect.github.com/databricks/databricks-sdk-go/pull/819)).

API Changes:

* Changed `Update` method for `w.LakehouseMonitors` workspace-level service with new required argument order.
* Added `w.OnlineTables` workspace-level service.
* Removed `AssetsDir` field for `catalog.UpdateMonitor`.
* Added `catalog.ContinuousUpdateStatus`, `catalog.DeleteOnlineTableRequest`, `catalog.FailedStatus`, `catalog.GetOnlineTableRequest`, `catalog.OnlineTable`, `catalog.OnlineTableSpec`, `catalog.OnlineTableState`, `catalog.OnlineTableStatus`, `catalog.PipelineProgress`, `catalog.ProvisioningStatus`, `catalog.TriggeredUpdateStatus` and `catalog.ViewData`.
* Added `ContentLength`, `ContentType` and `LastModified` fields for `files.DownloadResponse`.
* Changed `LastModified` field for `files.GetMetadataResponse` to `files.LastModifiedHttpDate`.
* Added `files.LastModifiedHttpDate`.
* Removed `Config` field for `serving.ExternalModel`.
* Added `Ai21labsConfig`, `AnthropicConfig`, `AwsBedrockConfig`, `CohereConfig`, `DatabricksModelServingConfig`, `OpenaiConfig` and `PalmConfig` fields for `serving.ExternalModel`.
* Removed `serving.ExternalModelConfig`.
* Added `MaxProvisionedThroughput` and `MinProvisionedThroughput` fields for `serving.ServedEntityInput` and `serving.ServedEntityOutput`.
OpenAPI SHA: cdd76a98a4fca7008572b3a94427566dd286c63b, Date: 2024-02-19
Commits:

* `eba5c8b` Release v0.33.0 ([#823](https://redirect.github.com/databricks/databricks-sdk-go/issues/823))
* `6846045` Add Int64 to header type injection ([#819](https://redirect.github.com/databricks/databricks-sdk-go/issues/819))
* `c6a803a` Add helper function to get header fields ([#822](https://redirect.github.com/databricks/databricks-sdk-go/issues/822))
* See full diff in [compare view](https://github.com/databricks/databricks-sdk-go/compare/v0.32.0...v0.33.0)
Most Recent Ignore Conditions Applied to This Pull Request:

| Dependency Name | Ignore Conditions |
| --- | --- |
| github.com/databricks/databricks-sdk-go | [>= 0.28.a, < 0.29] |

---------

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Andrew Nester <andrew.nester@databricks.com>
2024-02-19 14:30:06 +00:00
Lennart Kats (databricks) 162b115e19
Add an experimental default-sql template (#1051)
## Changes

This adds a `default-sql` template! 

In this latest revision, I've hidden the new template from the list so
we can merge it, iterate over it, and properly release the template at
the right time.

- [x] WorkspaceFS support for .sql files is in prod
- [x] SQL extension is preconfigured based on extension settings (if
possible)
- [ ] Streaming tables support is either ungated or the template
provides instructions about signup
- _Mitigation for now: this template is hidden from the list of
templates._
- [x] Support non-UC workspaces

## Tests
- [x] Unit tests
- [x] Manual testing
- [x] More manual testing
- [x] Reviewer testing

---------

Co-authored-by: Pieter Noordhuis <pieter.noordhuis@databricks.com>
Co-authored-by: PaulCornellDB <paul.cornell@databricks.com>
2024-02-19 12:01:11 +00:00
Lennart Kats (databricks) 1c680121c8
Add an experimental dbt-sql template (#1059)
## Changes

This adds a new dbt-sql template. This work requires the new WorkspaceFS
support for dbt tasks.

In this latest revision, I've hidden the new template from the list so
we can merge it, iterate over it, and properly release the template at
the right time.

Blockers:
- [x] WorkspaceFS support for dbt projects is in prod
- [x] Move dbt files into a subdirectory
- [ ] Wait until the next (>1.7.4) release of the dbt plugin which will
have major improvements!
- _Rather than wait, this template is hidden from the list of
templates._
- [x] SQL extension is preconfigured based on extension settings (if
possible)
- MV / streaming tables:
  - [x] Add to template
- [x] Fix https://github.com/databricks/dbt-databricks/issues/535 (to be
released in 1.7.4)
- [x] Merge https://github.com/databricks/dbt-databricks/pull/338 (to be
released in 1.7.4)
- [ ] Fix "too many 503 errors" issue
(https://github.com/databricks/dbt-databricks/issues/570, internal
tracker: ES-1009215, ES-1014138)
  - [x] Support ANSI mode in the template
- [ ] Streaming tables support is either ungated or the template
provides instructions about signup
- _Mitigation for now: this template is hidden from the list of
templates._
- [x] Support non-workspace-admin deployment
- [x] Make sure `data_security_mode: SINGLE_USER` works on non-UC
workspaces (it's required to be explicitly specified on UC workspaces
with single-node clusters)
- [x] Support non-UC workspaces

## Tests

- [x] Unit tests
- [x] Manual testing
- [x] More manual testing
- [ ] Reviewer manual testing
  - _I'd like to do a small bug bash post-merging._
- [x] Unit tests
2024-02-19 09:15:17 +00:00
Pieter Noordhuis 87dd46a3f8
Use dynamic configuration model in bundles (#1098)
## Changes

This is a fundamental change to how we load and process bundle
configuration. We now depend on the configuration being represented as a
`dyn.Value`. This representation is functionally equivalent to Go's
`any` (it can hold a value of any type) and allows us to capture metadata
associated with a value, such as where it was defined (e.g. file, line,
and column). It also allows us to represent Go's zero values properly
(e.g. empty string, integer equal to 0, or boolean false).
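
Conceptually, such a value pairs the data with its origin. A hypothetical
sketch (not the actual `dyn` API):

```go
package dyn

// Hypothetical sketch of a dynamic value carrying location metadata; the
// real dyn.Value API may differ.
type Location struct {
	File   string
	Line   int
	Column int
}

type Value struct {
	v   any      // underlying value: string, bool, int, map, sequence, ...
	loc Location // where the value was defined in the user's configuration
}

// A zero value (empty string, 0, false) remains representable: the wrapper
// records that some value was present at a location, so "unset" and
// "set to the zero value" are distinguishable.
func (v Value) Location() Location { return v.loc }
```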

Using this representation allows us to let the configuration model
deviate from the typed structure we have been relying on so far
(`config.Root`). We need to deviate from these types when using
variables for fields that are not a string themselves. For example,
using `${var.num_workers}` for an integer `workers` field was impossible
until now (though not implemented in this change).

The loader for a `dyn.Value` includes functionality to capture any and
all type mismatches between the user-defined configuration and the
expected types. These mismatches can be surfaced as validation errors in
future PRs.

Given that many mutators expect the typed struct to be the source of
truth, this change converts between the dynamic representation and the
typed representation on mutator entry and exit. Existing mutators can
continue to modify the typed representation and these modifications are
reflected in the dynamic representation (see `MarkMutatorEntry` and
`MarkMutatorExit` in `bundle/config/root.go`).

Required changes included in this change:
* The existing interpolation package is removed in favor of
`libs/dyn/dynvar`.
* Functionality to merge job clusters, job tasks, and pipeline clusters
are now all broken out into their own mutators.

To be implemented later:
* Allow variable references for non-string types.
* Surface diagnostics about the configuration provided by the user in
the validation output.
* Some mutators use a resource's configuration file path to resolve
related relative paths. These depend on `bundle/config/paths.Path` being
set and populated through `ConfigureConfigFilePath`. Instead, they
should interact with the dynamically typed configuration directly. Doing
this also unlocks being able to differentiate different base paths used
within a job (e.g. a task override with a relative path defined in a
directory other than the base job).

## Tests

* Existing unit tests pass (some have been modified to accommodate)
* Integration tests pass
2024-02-16 19:41:58 +00:00
Andrew Nester e474948a4b
Generate correct YAML if custom_tags or spark_conf is used for pipeline or job cluster configuration (#1210)
These fields (keys and values) need to be double-quoted in order for the
YAML loader to read, parse, and unmarshal them into the Go struct
correctly, because these fields are of type `map[string]string`.
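
A small demonstration of why, as a sketch using `gopkg.in/yaml.v3` rather
than the CLI's actual loader:

```go
package main

import (
	"fmt"

	"gopkg.in/yaml.v3"
)

func main() {
	var tags map[string]string

	// Unquoted scalars like 1234 or true are typed as int/bool by YAML, so
	// they cannot be unmarshaled into a map[string]string.
	err := yaml.Unmarshal([]byte("cost_center: 1234\nenabled: true"), &tags)
	fmt.Println("unquoted:", err)

	// Double-quoted values parse as strings and unmarshal cleanly.
	err = yaml.Unmarshal([]byte("cost_center: \"1234\"\nenabled: \"true\""), &tags)
	fmt.Println("quoted:", err, tags)
}
```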

## Tests
Added regression unit and E2E tests
2024-02-15 15:03:19 +00:00
dependabot[bot] 299e9b56a6
Bump github.com/databricks/databricks-sdk-go from 0.30.1 to 0.32.0 (#1199)
Bumps
[github.com/databricks/databricks-sdk-go](https://github.com/databricks/databricks-sdk-go)
from 0.30.1 to 0.32.0.

---------

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Andrew Nester <andrew.nester@databricks.com>
2024-02-15 14:52:17 +00:00
Andrew Nester 80670eceed
Added `bundle deployment bind` and `unbind` command (#1131)
## Changes
Added `bundle deployment bind` and `unbind` command.

This command allows binding bundle-defined resources to existing
resources in the Databricks workspace so that they become DABs-managed.

## Tests
Manually + added E2E test
2024-02-14 18:04:45 +00:00
Miles Yucht e8b0698e19
Regenerate the CLI using the same OpenAPI spec as the SDK (#1205)
## Changes
The OpenAPI spec used to generate the CLI doesn't match the one used by
the SDK version the CLI currently depends on. This PR regenerates the CLI
based on the same version of the OpenAPI spec used by the SDK at v0.30.1.

## Tests
2024-02-13 14:33:59 +00:00
Andrew Nester bc30c9ed4a
Added `--restart` flag for `bundle run` command (#1191)
## Changes
Added `--restart` flag for `bundle run` command

When running with this flag, `bundle run` will cancel all existing runs
before starting a new one

## Tests
Manually
2024-02-09 14:33:14 +00:00
shreyas-goenka d638262665
Add spinner when downloading templates for bundle init (#1188)
## Changes
Templates can take a long time to download. This PR adds a spinner to
give feedback to users.

## Tests
Manually


https://github.com/databricks/cli/assets/88374338/b453982c-3233-40f4-8d6f-f31606ff0195
2024-02-08 12:52:53 +00:00
Pieter Noordhuis a835a3e564
Ignore environment variables for `auth profiles` (#1189)
## Changes

If environment variables related to unified authentication are set and a
user runs `auth profiles`, the environment variables will interfere with
the output. This change only takes profile data into account for the
output.

## Tests

Added a unit test.
2024-02-08 12:25:51 +00:00
Pieter Noordhuis b1b5ad8acd
Log time it takes for profile to load (#1186)
## Changes

Aids debugging why `auth profiles` may take longer than expected.

## Tests

Confirmed manually that timing information shows up in the log output.
2024-02-08 11:10:52 +00:00
Pieter Noordhuis 8e58e04e8f
Move folders package into libs (#1184)
## Changes

This is the last top-level package that doesn't need to be top-level.
2024-02-07 16:33:18 +00:00
Andrew Nester 6edab93233
Added warning when trying to deploy bundle with `--fail-if-running` and running resources (#1163)
## Changes
Deploying a bundle while its resources are running can be disruptive for
jobs and pipelines in progress.

With this change, during the deployment phase (before uploading any
resources), if `--fail-if-running` is specified, DABs will check whether
any resources are running and, if so, fail the deployment.

## Tests
Manual + add tests
2024-02-07 11:17:17 +00:00
Andrew Nester 2bbb644749
Group bundle run flags by job and pipeline types (#1174)
## Changes
Group bundle run flags by job and pipeline types

## Tests
```
Run a resource (e.g. a job or a pipeline)

Usage:
  databricks bundle run [flags] KEY

Job Flags:
      --dbt-commands strings                 A list of commands to execute for jobs with DBT tasks.
      --jar-params strings                   A list of parameters for jobs with Spark JAR tasks.
      --notebook-params stringToString       A map from keys to values for jobs with notebook tasks. (default [])
      --params stringToString                comma separated k=v pairs for job parameters (default [])
      --pipeline-params stringToString       A map from keys to values for jobs with pipeline tasks. (default [])
      --python-named-params stringToString   A map from keys to values for jobs with Python wheel tasks. (default [])
      --python-params strings                A list of parameters for jobs with Python tasks.
      --spark-submit-params strings          A list of parameters for jobs with Spark submit tasks.
      --sql-params stringToString            A map from keys to values for jobs with SQL tasks. (default [])

Pipeline Flags:
      --full-refresh strings   List of tables to reset and recompute.
      --full-refresh-all       Perform a full graph reset and recompute.
      --refresh strings        List of tables to update.
      --refresh-all            Perform a full graph update.

Flags:
  -h, --help      help for run
      --no-wait   Don't wait for the run to complete.

Global Flags:
      --debug            enable debug logging
  -o, --output type      output type: text or json (default text)
  -p, --profile string   ~/.databrickscfg profile
  -t, --target string    bundle target to use (if applicable)
      --var strings      set values for variables defined in bundle config. Example: --var="foo=bar"
   ```
2024-02-06 14:51:02 +00:00
Andrew Nester b28432afed
Add `--key` flag for generate commands to specify resource key (#1165)
## Changes
Add `--key` for generate commands to specify the resource key.

Also, resource config files are no longer prefixed.

## Tests
Integration tests passed

---------

Co-authored-by: Pieter Noordhuis <pieter.noordhuis@databricks.com>
2024-01-31 10:23:35 +00:00
hectorcast-db e3d2dbb0ab
Update Go SDK to v0.30.1 (#1162)
## Changes
Update Go SDK to 0.30.1

## Tests
```
make fmt && make test
```
2024-01-29 15:25:10 +00:00
Andrew Nester 1fb15331c5
Bring back `--json` flag for workspace-conf set-status command (#1151)
## Changes
The `--json` flag was removed from this command when the MustUseJson /
CanUseJson generator functions were introduced, which did not account for
request types that are maps.

This PR brings the flag back.

Relies on this Go SDK change:
https://github.com/databricks/databricks-sdk-go/pull/786
2024-01-25 11:55:17 +00:00
Andrew Nester f269f8015d
Added `bundle generate pipeline` command (#1139)
## Changes
Added `bundle generate pipeline` command

Usage is as follows:

```
databricks bundle generate pipeline --existing-pipeline-id f3b8c580-0a88-4b55-xxxx-yyyyyyyyyy
```

## Tests
Manually + added E2E test
2024-01-25 11:35:14 +00:00
Ilia Babanov 9c3e4fda7c
Add "bundle summary" command (#1123)
The plan is to use the new command in the Databricks VSCode extension to
render "modified" UI state in the bundle resource tree elements, plus use
resource IDs to generate links for the resources.

### New revision
- Renamed `remote-state` to `summary`
- Added "modified statuses" to all resources. Currently we don't set
"updated" status - it's either nothing, or created/deleted
- Added tests for the `TerraformToBundle` command
2024-01-25 11:32:47 +00:00
Serge Smertin deb7e67ad5
Prompt for account profile only for account-level command execution instead of during `databricks labs install` flow (#1128)
## Changes

There's a lot of end-user friction for projects that require
account-level commands. This is mainly related to the fact that, as of
January 2024, workspace administrators do not necessarily have access to
call account-level APIs. Ongoing discussions exist on how best to
implement this at the platform level.

A temporary workaround is creating a dummy ~/.databrickscfg profile with
the `account_id` field, though it doesn't remove the end-user friction.
Hence, we no longer require an account profile during installation and
just prompt for it when the context requires it. This also means that we
always prompt for account-level commands unless users specify a
`--profile` flag.

## Tests
- `go run main.go labs install ucx`, don't see an account profile prompt
- `go run main.go labs ucx sync-workspace-info`, to see a profile prompt
and have a valid auth passed
- `go run main.go labs ucx sync-workspace-info --debug --profile
profile-name` to get a concrete profile passed
2024-01-22 17:35:13 +00:00
Andrew Nester 70fe0e36ef
Added `databricks bundle generate job` command (#1043)
## Changes
Now it's possible to generate bundle configuration for an existing job.
For now it only supports jobs with notebook tasks.

It will download the notebooks referenced in the job tasks and generate
bundle YAML config for this job, which can be included in a larger bundle.

## Tests
Running command manually

Example of generated config
```
resources:
  jobs:
    job_128737545467921:
      name: Notebook job
      format: MULTI_TASK
      tasks:
        - task_key: as_notebook
          existing_cluster_id: 0704-xxxxxx-yyyyyyy
          notebook_task:
            base_parameters:
              bundle_root: /Users/andrew.nester@databricks.com/.bundle/job_with_module_imports/development/files
            notebook_path: ./entry_notebook.py
            source: WORKSPACE
          run_if: ALL_SUCCESS
      max_concurrent_runs: 1
 ```

## Tests
Manual (on our last 100 jobs) + added end-to-end test

```
--- PASS: TestAccGenerateFromExistingJobAndDeploy (50.91s)
PASS
coverage: 61.5% of statements in ./...
ok github.com/databricks/cli/internal/bundle 51.209s coverage: 61.5% of
statements in ./...
```
2024-01-17 14:26:33 +00:00
Andrew Nester 98477699a0
Always require path parameters as positional arguments (#1129)
## Changes
Always require path parameters as positional arguments
Note: uses a generator with this SDK change:
https://github.com/databricks/databricks-sdk-go/pull/773

Fixes https://github.com/databricks/cli/issues/1121
2024-01-17 14:14:20 +00:00
Andrew Nester ef67b1755e
Do not require positional arguments if they should be provided in JSON (#1125)
## Changes
Do not require positional arguments if they should be provided in JSON

Fixes #1122
2024-01-17 10:53:50 +00:00
shreyas-goenka 2c0d06715c
Fix windows style file paths in fs cp command (#1118)
## Changes
Copying a local file on Windows to a remote directory in DBFS would fail
if the path was specified as a Windows-style path (as opposed to a UNIX-style
path). This PR fixes that.

Note, UNIX style paths will continue to work because `filepath.Base`
respects both `/` and `\` as file separators. See: `IsPathSeparator` in
https://go.dev/src/os/path_windows.go.
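
For illustration, the standard-library behavior this relies on:

```go
package main

import (
	"fmt"
	"path/filepath"
)

func main() {
	// On Windows, filepath treats both `\` and `/` as separators, so Base
	// extracts the file name from either style of path.
	fmt.Println(filepath.Base(`.\Desktop\foo.txt`)) // "foo.txt" on Windows
	fmt.Println(filepath.Base(`./Desktop/foo.txt`)) // "foo.txt" on any OS
}
```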

Fixes issue: https://github.com/databricks/cli/issues/1109.

## Tests
Integration test and manually
```
C:\Users\shreyas.goenka>Desktop\cli.exe fs cp .\Desktop\foo.txt dbfs:/Users/shreyas.goenka@databricks.com
.\Desktop\foo.txt -> dbfs:/Users/shreyas.goenka@databricks.com/foo.txt

C:\Users\shreyas.goenka>Desktop\cli.exe fs cat  dbfs:/Users/shreyas.goenka@databricks.com/foo.txt
hello, world
```
2024-01-11 18:49:42 +00:00
Pieter Noordhuis 3c76a11d00
Upgrade Go SDK to v0.29.0 (#1111)
## Changes

See:
* https://github.com/databricks/databricks-sdk-go/releases/tag/v0.29.0
* https://github.com/databricks/databricks-sdk-go/releases/tag/v0.28.0

## Tests

Unit and integration tests pass.
2024-01-11 08:16:25 +00:00
Miles Yucht 8a1be76910
Always log with text format by default (#1096)
## Changes
The JSON logger is excellent as a machine-readable logger with lots of
metadata, but the resulting logs are difficult to read:

<img width="1601" alt="Image_from_Databricks"
src="https://github.com/databricks/cli/assets/1850319/76aa852f-756f-4e0a-bc00-3a6e3224296a">

Currently, we only use the friendly log printer when run from a TTY.
This PR removes that restriction, so logs will be pretty-printed by
default, regardless of TTY or not. If a user needs machine-readable
logs, they can still use `--log-format JSON`.

## Tests
Manual test: `databricks current-user me --debug | cat` uses the
pretty-printing logger.


![Screenshot_02_01_2024__13_12](https://github.com/databricks/cli/assets/1850319/45fd5587-52f6-4864-b7d2-3708ed2ff87f)
2024-01-03 09:39:33 +00:00
Andrew Nester e80882b5af
Allow account client auth with environment variables when no .databrickscfg file present (#1097)
## Changes
Allow account client auth with environment variables when no
.databrickscfg file present

Makes the behaviour to be in line with WorkspaceClient auth.
## Tests
Added regression test
2024-01-02 15:34:43 +00:00
Andrew Nester 3e6e04831f
Fixed storage-credentials list command in text output (#1094)
## Changes
Fixes #1029 
Closes #910 

## Tests
Added regression test
2024-01-02 07:24:51 +00:00
Lennart Kats (databricks) 206b1bf198
Tweak command groups in CLI help (#1092)
## Changes

This tweaks the help output shown when using `databricks help`:
* make `jobs` appear under `Workflows` (as done in the baseline OpenAPI spec)
* move `bundle` and `sync` under a new group called `Developer Tools`
(similar to what we have in the docs)
* minor wording changes
2023-12-28 13:14:55 +00:00
Lennart Kats (databricks) 10a8ce4562
Improve experience for multiple builtin templates (#1052)
## Changes
This enhances the template selection experience a bit as we add more and
more built-in templates (like
https://github.com/databricks/cli/pull/1051 and
https://github.com/databricks/cli/pull/1059):

### New experience:
<img width="661" alt="image"
src="https://github.com/databricks/cli/assets/58432911/afe3b84d-8a77-47f3-b9c2-f827f7893cd7">

### Current experience:
<img width="265" alt="image"
src="https://github.com/databricks/cli/assets/58432911/36f8d568-819f-4920-83b1-fb76109ea3d1">

---------

Co-authored-by: shreyas-goenka <88374338+shreyas-goenka@users.noreply.github.com>
2023-12-27 12:03:08 +00:00
Andrew Nester 5526cd3fb2
Added output template for list-secrets command (#1074)
## Changes

Fixes #1067

## Tests

```
andrew.nester@HFW9Y94129 cli % databricks secrets list-secrets "my-test-scope"  --output text         
Key             Last Updated Timestamp
my-secret       1692805686489
my-test-secret  1692767910771
```
2023-12-18 16:09:11 +00:00
Andrew Nester 6dd6899b52
Do not allow input prompts in Git Bash terminal (#1069)
## Changes

Likely due to the fact that Git Bash does not correctly support ANSI
escape sequences, we cannot use the `promptui` package there. See known
issues:

- https://github.com/manifoldco/promptui/issues/208
- https://github.com/chzyer/readline/issues/191
2023-12-18 15:01:59 +00:00
Andrew Nester a6ec9ac08b
Upgrade Go SDK to 0.27.0 (#1064)
## Changes
Upgrade Go SDK to 0.27.0
2023-12-14 08:15:00 +00:00
Serge Smertin 42c06267eb
Stub out Python virtual environment installation for `labs` commands (#1057)
This PR removes 15 seconds from `make test` runtime
2023-12-11 16:30:19 +00:00
shreyas-goenka 6002f49c87
Move bundle schema update to an internal module (#1012)
## Changes

This PR:
1. Move code to load bundle JSON Schema descriptions from the OpenAPI
spec to an internal Go module
2. Remove command line flags from the `bundle schema` command. These
flags were meant for internal processes and at no point were meant for
customer use.
3. Regenerate `bundle_descriptions.json`
4. Add support for `bundle: "deprecated"`. The `environments` field is
tagged as deprecated in this PR and consequently will no longer be a
part of the bundle schema.

## Tests
Tested by regenerating the CLI against its current OpenAPI spec (as
defined in `__openapi_sha`). The `bundle_descriptions.json` in this PR
was generated by the code generator.

Manually checked that the autocompletion / descriptions from the new
bundle schema are correct.
2023-12-06 10:45:18 +00:00
shreyas-goenka a6752a5388
Add list of supported values for flags that represent an enum field (#1036)
## Changes
This PR adds the list of supported values to the documentation of flags
that represent an enum field.
2023-12-06 08:12:17 +00:00
Fabian Jakobs 66e923261d
Ask for host when .databrickscfg doesn't exist (#1041)
## Changes
Ask for host when .databrickscfg doesn't exist

This fixes a regression introduced by
https://github.com/databricks/cli/pull/1003
2023-12-04 15:40:52 +00:00
Pieter Noordhuis 60a8abdcd7
Rewrite the friendly log handler (#1038)
## Changes

It wasn't working because it deferred to the regular `slog.TextHandler`
for the `WithAttrs` and `WithGroup` functions. Both of these functions
don't mutate the handler but return a new one. When the top-level logger
called one of these, log records in that context used the standard
handler instead of ours.

To implement tracking of attributes and groups, I followed the guide at
https://github.com/golang/example/blob/master/slog-handler-guide/README.md
for writing custom handlers.
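
The gist of that pattern, as a minimal sketch rather than the actual
handler:

```go
package main

import (
	"context"
	"fmt"
	"log/slog"
)

// friendlyHandler tracks attributes and groups itself instead of
// delegating to an embedded slog.TextHandler.
type friendlyHandler struct {
	attrs  []slog.Attr
	groups []string
}

func (h *friendlyHandler) Enabled(ctx context.Context, level slog.Level) bool { return true }

func (h *friendlyHandler) Handle(ctx context.Context, r slog.Record) error {
	fmt.Printf("%s %s attrs=%v groups=%v\n", r.Level, r.Message, h.attrs, h.groups)
	return nil
}

// WithAttrs and WithGroup must return a new handler carrying the
// accumulated state; mutating the receiver would leak state across loggers.
func (h *friendlyHandler) WithAttrs(attrs []slog.Attr) slog.Handler {
	h2 := *h
	h2.attrs = append(append([]slog.Attr{}, h.attrs...), attrs...)
	return &h2
}

func (h *friendlyHandler) WithGroup(name string) slog.Handler {
	h2 := *h
	h2.groups = append(append([]string{}, h.groups...), name)
	return &h2
}

func main() {
	log := slog.New(&friendlyHandler{}).With("request_id", 42).WithGroup("api")
	log.Info("hello") // INFO hello attrs=[request_id=42] groups=[api]
}
```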

## Tests

The new tests demonstrate formatting through `t.Log` and look good.
2023-12-01 12:17:04 +00:00
shreyas-goenka 76840176e3
Add documentation for positional args in commands generated from the Databricks OpenAPI specification (#1033)
## Changes
This PR adds documentation for positional arguments in commands that are
generated from the OpenAPI spec.

Note: the changes to `.gitattributes` will be reverted / properly fixed
in https://github.com/databricks/cli/pull/1012.
2023-11-30 16:22:23 +00:00
shreyas-goenka 677926b78b
Fix panic when bundle auth resolution fails (#1002)
## Changes
The CLI would panic if invalid bundle auth was set up when running CLI
commands. This PR removes the panic and shows the error message directly
instead.

## Tests
The CWD is a bundle with:
```
workspace:
  profile: DEFAULT
```

Before:
```
shreyas.goenka@THW32HFW6T bundle-playground % cli clusters list
panic: resolve: /Users/shreyas.goenka/.databrickscfg has no DEFAULT profile configured. Config: profile=DEFAULT

goroutine 1 [running]:
```

After:
```
shreyas.goenka@THW32HFW6T bundle-playground % cli clusters list
Error: cannot resolve bundle auth configuration: resolve: /Users/shreyas.goenka/.databrickscfg has no DEFAULT profile configured. Config: profile=DEFAULT
```

```
shreyas.goenka@THW32HFW6T bundle-playground % DATABRICKS_CONFIG_FILE=/dev/null cli bundle deploy
Error:  cannot resolve bundle auth configuration: resolve: /dev/null has no DEFAULT profile configured. Config: profile=DEFAULT, config_file=/dev/null. Env: DATABRICKS_CONFIG_FILE
```
2023-11-30 14:28:01 +00:00
Pieter Noordhuis 10c9eca06f
Filter out system clusters for `--configure-cluster` (#1031)
## Changes

Only clusters with their source attribute equal to `UI` or `API` should
be presented in the dropdown.
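
A sketch of that filter, assuming the Go SDK's `compute` types (the
actual implementation may differ):

```go
package main

import (
	"fmt"

	"github.com/databricks/databricks-sdk-go/service/compute"
)

// filterUserClusters keeps clusters created via the UI or the API,
// dropping job clusters and other system-created clusters.
func filterUserClusters(all []compute.ClusterDetails) []compute.ClusterDetails {
	var out []compute.ClusterDetails
	for _, c := range all {
		switch c.ClusterSource {
		case compute.ClusterSourceUi, compute.ClusterSourceApi:
			out = append(out, c)
		}
	}
	return out
}

func main() {
	clusters := []compute.ClusterDetails{
		{ClusterName: "mine", ClusterSource: compute.ClusterSourceUi},
		{ClusterName: "job-run", ClusterSource: compute.ClusterSourceJob},
	}
	fmt.Println(len(filterUserClusters(clusters))) // 1
}
```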

## Tests

Unit test and manual confirmation.
2023-11-30 09:59:11 +00:00
Pieter Noordhuis 4a228e6f12
Fix `databricks configure` if new profile is specified (#1030)
## Changes

The code included the to-be-created profile in the configuration and
that triggered the SDK to try and load it. Instead, we must use the
specified host and token directly.
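
Schematically, the fix amounts to something like the following sketch
(the host and token values are placeholders, not real credentials):

```go
package main

import (
	"github.com/databricks/databricks-sdk-go"
)

func main() {
	// Validate using the given host and token directly. Setting Profile
	// here instead would make the SDK try to load the not-yet-written
	// profile from .databrickscfg.
	w, err := databricks.NewWorkspaceClient(&databricks.Config{
		Host:  "https://example.cloud.databricks.com",
		Token: "dapi-example", // placeholder
	})
	_, _ = w, err
}
```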

## Tests

Manually. More integration test coverage tbd.
2023-11-30 09:51:52 +00:00
Serge Smertin 65458cbde6
Fix `panic: $HOME is not set` (#1027)
This PR adds an error return value to `env.UserHomeDir(ctx)`

Fixes https://github.com/databricks/setup-cli/issues/73

---------

Co-authored-by: Pieter Noordhuis <pieter.noordhuis@databricks.com>
2023-11-29 19:08:27 +00:00
Pieter Noordhuis 0cd3bb072d
Bump Go SDK to v0.26.0 (#1019)
## Changes

Bump Go SDK to v0.26.0.

Changelog at
https://github.com/databricks/databricks-sdk-go/releases/tag/v0.26.0.

## Tests

Integration tests pass.
2023-11-29 13:29:31 +00:00
Pieter Noordhuis deb062c489
Fix bug where the account or workspace client could be `nil` (#1020)
## Changes

We didn't return the error upon creating a workspace or account client.
If there is an error, it must always propagate up the stack. The result
of this bug was that we were setting a `nil` account or workspace
client, which in turn caused SIGSEGVs.
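
The pattern of the fix, sketched with an assumed wrapper function:

```go
package main

import (
	"fmt"

	"github.com/databricks/databricks-sdk-go"
)

// newWorkspaceClient propagates the construction error instead of
// ignoring it and returning a nil client (which later dereferences
// would SIGSEGV on).
func newWorkspaceClient() (*databricks.WorkspaceClient, error) {
	w, err := databricks.NewWorkspaceClient()
	if err != nil {
		return nil, fmt.Errorf("cannot create workspace client: %w", err)
	}
	return w, nil
}

func main() {
	if _, err := newWorkspaceClient(); err != nil {
		fmt.Println("Error:", err)
	}
}
```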

Fixes #913.

## Tests

Manually confirmed this fixes the linked issue. The CLI now correctly
returns an error when the client cannot be constructed.

The issue was reproducible using a `.databrickscfg` with a single,
incorrectly configured profile.
2023-11-29 13:29:17 +00:00
shreyas-goenka dd1d540429
Add mlops-stacks to the default `databricks bundle init` prompt (#988)
## Changes
This makes mlops-stacks more discoverable and improves the UX of
initialising the mlops-stacks template.

## Tests
Manually

Dropdown UI:
```
shreyas.goenka@THW32HFW6T projects % cli bundle init
Template to use:
  ▸ default-python
    mlops-stacks
```

Help message:
```
shreyas.goenka@THW32HFW6T bricks % cli bundle init -h
Initialize using a bundle template.

TEMPLATE_PATH optionally specifies which template to use. It can be one of the following:
- default-python: The default Python template
- mlops-stacks: The Databricks MLOps Stacks template. More information can be found at: https://github.com/databricks/mlops-stacks
```
2023-11-28 09:04:06 +00:00
shreyas-goenka 96e9545cf0
Automate the generation of bundle schema descriptions (#1007)
## Changes
This PR makes the changes required to automatically update the bundle
docs during the CLI release process. We rely on `post_generate` scripts
that are executed after code generation, with the CWD set to the CLI repo
root.

The new `output-file` flag is introduced because stdout redirection does
not work here and would otherwise require changes to our release
automation CLI (deco CLI).

## Tests
Manually. Regenerated the CLI and the descriptions were indeed generated
for the CLI from the provided OpenAPI spec.
2023-11-27 10:42:39 +00:00
Pieter Noordhuis d985601d30
Add `--configure-cluster` flag to configure command (#1005)
## Changes

This breaks out the flags into a separate struct to make them easier to
pass around.

If specified, the flag calls into the `cfgpicker` to prompt for a
cluster to associate with the profile.

## Tests

Existing tests pass; added one for host validation.

---------

Co-authored-by: Miles Yucht <miles@databricks.com>
2023-11-23 19:56:48 +00:00
Miles Yucht 07c4c90772
Tolerate missing .databrickscfg file during `databricks auth login` (#1003)
## Changes
`databricks configure` creates a new .databrickscfg if one doesn't
already exist, but `databricks auth login` fails in this case. Because
`databricks auth login` writes out the config file anyway, we gracefully
handle this error and continue.
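
A minimal sketch of that graceful handling; `readConfigFile` is a
hypothetical helper, not the actual code:

```go
package main

import (
	"errors"
	"fmt"
	"io/fs"
	"os"
)

// readConfigFile is a hypothetical helper: a missing .databrickscfg is
// not an error here, because `auth login` writes the file out anyway.
func readConfigFile(path string) ([]byte, error) {
	b, err := os.ReadFile(path)
	if errors.Is(err, fs.ErrNotExist) {
		return nil, nil // start from an empty config
	}
	return b, err
}

func main() {
	b, err := readConfigFile("/nonexistent/.databrickscfg")
	fmt.Println(len(b), err) // 0 <nil>
}
```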

## Tests
Unit test.

```
$ ls ~/.databrickscfg*    
/Users/miles/.databrickscfg.bak

$ ./cli auth login
Databricks Profile Name: test
Databricks Host: https://<HOST>
Profile test was successfully saved

$ ls ~/.databrickscfg*
/Users/miles/.databrickscfg     /Users/miles/.databrickscfg.bak

$ cat ~/.databrickscfg
; The profile defined in the DEFAULT section is to be used as a fallback when no profile is explicitly specified.
[DEFAULT]

[test]
host      = https://<HOST>
auth_type = databricks-cli
```
2023-11-23 09:04:54 +00:00
Serge Smertin 1b7558cd7d
Add `databricks labs` command group (#914)
## Command group
<img width="688" alt="image"
src="https://github.com/databricks/cli/assets/259697/51a3d309-2244-40ff-b6c3-4f151da6a6ec">

## Installed versions
<img width="388" alt="image"
src="https://github.com/databricks/cli/assets/259697/0873e8ac-20cc-4bab-bb32-e064eddc05f2">

## Project commands
<img width="479" alt="image"
src="https://github.com/databricks/cli/assets/259697/618949e2-99f1-466b-9288-97e88c715518">

## Installer hook

![image](https://github.com/databricks/cli/assets/259697/3ce0d355-039a-445f-bff7-6dfc1a2e3288)

## Update notifications

![image](https://github.com/databricks/cli/assets/259697/10724627-3606-49e1-9722-00ae37afed12)

# Downstream work

- https://github.com/databrickslabs/ucx/pull/517
- https://github.com/databrickslabs/dlt-meta/pull/19
- https://github.com/databrickslabs/discoverx/pull/84
2023-11-17 12:47:37 +00:00
shreyas-goenka 0c837e5772
Make `file_path` and `artifact_path` fields consistent with json tag (#987)
## Changes
This PR:
1. Renames `FilesPath` -> `FilePath` and `ArtifactsPath` ->
`ArtifactPath` in the bundle and metadata configuration to make them
consistent with the JSON tags.
2. Fixes development / production mode error messages to point to
`file_path` and `artifact_path`.

## Tests
Existing unit tests. This is a straightforward renaming of the fields.
2023-11-15 13:37:26 +00:00
shreyas-goenka a25f10f247
Add `--tag` and `--branch` options to bundle init command (#975)
## Tests
Tested manually. Specified branch / tag are indeed cloned and used by
bundle init.
2023-11-14 22:27:58 +00:00
dependabot[bot] c3ced68c60
Bump github.com/databricks/databricks-sdk-go from 0.24.0 to 0.25.0 (#980)
Bumps
[github.com/databricks/databricks-sdk-go](https://github.com/databricks/databricks-sdk-go)
from 0.24.0 to 0.25.0.
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/databricks/databricks-sdk-go/blob/main/CHANGELOG.md">github.com/databricks/databricks-sdk-go's
changelog</a>.</em></p>
<blockquote>
<h2>0.25.0</h2>
<ul>
<li>Make sure path parameters are first in order in RequiredFields (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/pull/669">#669</a>).</li>
<li>Added Field.IsRequestBodyField method for code generation (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/pull/670">#670</a>).</li>
<li>Added regressions question to the issue template (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/pull/676">#676</a>).</li>
<li>Added telemetry for CI/CD platform to useragent (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/pull/665">#665</a>).</li>
<li>Skiped GCP Integration Tests using Statement Execution API (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/pull/678">#678</a>).</li>
<li>Added more detailed error message on default credentials not found
error (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/pull/679">#679</a>).</li>
<li>Updated SDK to latest OpenAPI Spec (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/pull/685">#685</a>).</li>
</ul>
<p>API Changes:</p>
<ul>
<li>Changed <code>Create</code> method for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#FunctionsAPI">w.Functions</a>
and <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#MetastoresAPI">w.Metastores</a>
workspace-level service with new required argument order.</li>
<li>Changed <code>InputParams</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#CreateFunction">catalog.CreateFunction</a>
and <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#FunctionInfo">catalog.FunctionInfo</a>
to <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#FunctionParameterInfos">catalog.FunctionParameterInfos</a>.</li>
<li>Changed <code>Properties</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#CreateFunction">catalog.CreateFunction</a>
and <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#FunctionInfo">catalog.FunctionInfo</a>
to <code>string</code>.</li>
<li>Changed <code>ReturnParams</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#CreateFunction">catalog.CreateFunction</a>
and <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#FunctionInfo">catalog.FunctionInfo</a>
to <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#FunctionParameterInfos">catalog.FunctionParameterInfos</a></li>
- Changed `StorageRoot` field for [catalog.CreateMetastore](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#CreateMetastore) to no longer be required.
- Added `SkipValidation` field for [catalog.UpdateExternalLocation](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#UpdateExternalLocation).
- Added `Libraries` field for [compute.CreatePolicy](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/compute#CreatePolicy), [compute.EditPolicy](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/compute#EditPolicy) and [compute.Policy](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/compute#Policy).
- Added [compute.InitScriptEventDetails](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/compute#InitScriptEventDetails).
- Added `InitScripts` field for [compute.EventDetails](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/compute#EventDetails).
- Added `File` field for [compute.InitScriptInfo](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/compute#InitScriptInfo).
- Added `ZoneId` field for [compute.InstancePoolGcpAttributes](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/compute#InstancePoolGcpAttributes).
- Added `IncludeResolvedValues` field for [jobs.GetRunRequest](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#GetRunRequest).
- Added `EditMode` field for [jobs.CreateJob](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#CreateJob) and [jobs.JobSettings](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#JobSettings).
- Added `NetworkConnectivityConfigId` field for [provisioning.UpdateWorkspaceRequest](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/provisioning#UpdateWorkspaceRequest).
- Added `ContainerLogs` and `ExtraInfo` fields for [serving.DeploymentStatus](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#DeploymentStatus).
- Added [catalog.CreateFunctionRequest](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#CreateFunctionRequest), [catalog.DependencyList](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#DependencyList) and [catalog.FunctionParameterInfos](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#FunctionParameterInfos).
- Added [compute.InitScriptExecutionDetails](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/compute#InitScriptExecutionDetails), [compute.InitScriptExecutionDetailsStatus](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/compute#InitScriptExecutionDetailsStatus), [compute.InitScriptInfoAndExecutionDetails](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/compute#InitScriptInfoAndExecutionDetails) and [compute.LocalFileInfo](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/compute#LocalFileInfo).
- Added [jobs.CreateJobEditMode](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#CreateJobEditMode) and [jobs.JobSettingsEditMode](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#JobSettingsEditMode).
- Added `DeleteApp`, `GetApp`, `GetAppDeploymentStatus`, `GetApps` and `GetEvents` methods for the [w.Apps](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#AppsAPI) workspace-level service.
- Added [serving.AppEvents](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#AppEvents), [serving.AppServiceStatus](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#AppServiceStatus), [serving.DeleteAppResponse](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#DeleteAppResponse), [serving.GetAppDeploymentStatusRequest](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#GetAppDeploymentStatusRequest), [serving.GetAppResponse](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#GetAppResponse), [serving.GetEventsRequest](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#GetEventsRequest), [serving.ListAppEventsResponse](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#ListAppEventsResponse) and [serving.ListAppsResponse](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#ListAppsResponse).
- Added the [a.NetworkConnectivity](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/settings#NetworkConnectivityAPI) account-level service.
- Added [settings.CreateNetworkConnectivityConfigRequest](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/settings#CreateNetworkConnectivityConfigRequest), [settings.CreatePrivateEndpointRuleRequest](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/settings#CreatePrivateEndpointRuleRequest), [settings.CreatePrivateEndpointRuleRequestGroupId](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/settings#CreatePrivateEndpointRuleRequestGroupId), [settings.DeleteNetworkConnectivityConfigurationRequest](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/settings#DeleteNetworkConnectivityConfigurationRequest), [settings.DeletePrivateEndpointRuleRequest](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/settings#DeletePrivateEndpointRuleRequest), [settings.GetNetworkConnectivityConfigurationRequest](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/settings#GetNetworkConnectivityConfigurationRequest), [settings.GetPrivateEndpointRuleRequest](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/settings#GetPrivateEndpointRuleRequest), [settings.NccAzurePrivateEndpointRule](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/settings#NccAzurePrivateEndpointRule), [settings.NccAzurePrivateEndpointRuleConnectionState](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/settings#NccAzurePrivateEndpointRuleConnectionState), [settings.NccAzurePrivateEndpointRuleGroupId](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/settings#NccAzurePrivateEndpointRuleGroupId), [settings.NccAzureServiceEndpointRule](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/settings#NccAzureServiceEndpointRule), [settings.NccEgressConfig](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/settings#NccEgressConfig), [settings.NccEgressDefaultRules](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/settings#NccEgressDefaultRules), [settings.NccEgressTargetRules](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/settings#NccEgressTargetRules) and [settings.NetworkConnectivityConfiguration](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/settings#NetworkConnectivityConfiguration).
- Removed `Delete` and `Get` methods for the [w.Apps](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#AppsAPI) workspace-level service.
- Removed [jobs.JobSettingsUiState](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#JobSettingsUiState) and [jobs.CreateJobUiState](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#CreateJobUiState).
- Removed the [a.OAuthEnrollment](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/oauth2#OAuthEnrollmentAPI) account-level service.
- Removed [oauth2.CreateOAuthEnrollment](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/oauth2#CreateOAuthEnrollment) and [oauth2.OAuthEnrollmentStatus](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/oauth2#OAuthEnrollmentStatus).
- Removed `UiState` field for [jobs.CreateJob](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#CreateJob) and [jobs.JobSettings](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#JobSettings).

OpenAPI SHA: e7b127cb07af8dd4d8c61c7cc045c8910cdbb02a, Date: 2023-11-08

Dependency updates:

- Bump google.golang.org/api from 0.146.0 to 0.150.0 ([#683](https://redirect.github.com/databricks/databricks-sdk-go/pull/683)).
- Bump golang.org/x/mod from 0.13.0 to 0.14.0 ([#681](https://redirect.github.com/databricks/databricks-sdk-go/pull/681)).
- Bump google.golang.org/grpc from 1.58.2 to 1.58.3 in /examples/slog ([#672](https://redirect.github.com/databricks/databricks-sdk-go/pull/672)).
- Bump google.golang.org/grpc to 1.58.3 in /examples/zerolog ([#684](https://redirect.github.com/databricks/databricks-sdk-go/pull/684)).
- Bump golang.org/x/time from 0.3.0 to 0.4.0 ([#680](https://redirect.github.com/databricks/databricks-sdk-go/pull/680)).
</blockquote>
</details>
<details>
<summary>Commits</summary>

- [`03b614c`](03b614c890) Release v0.25.0 ([#687](https://redirect.github.com/databricks/databricks-sdk-go/issues/687))
- [`16970df`](16970dfeb3) Update SDK to latest OpenAPI Spec ([#685](https://redirect.github.com/databricks/databricks-sdk-go/issues/685))
- [`712fcfe`](712fcfed33) Bump golang.org/x/time from 0.3.0 to 0.4.0 ([#680](https://redirect.github.com/databricks/databricks-sdk-go/issues/680))
- [`3215e79`](3215e790ed) Bump google.golang.org/grpc from 1.58.2 to 1.58.3 in /examples/zerolog ([#684](https://redirect.github.com/databricks/databricks-sdk-go/issues/684))
- [`f06dd73`](f06dd7330c) Bump google.golang.org/grpc from 1.58.2 to 1.58.3 in /examples/slog ([#672](https://redirect.github.com/databricks/databricks-sdk-go/issues/672))
- [`2bb6c8d`](2bb6c8de98) Bump google.golang.org/grpc from 1.57.0 to 1.57.1 in /examples/zerolog ([#674](https://redirect.github.com/databricks/databricks-sdk-go/issues/674))
- [`e045c01`](e045c01326) Bump golang.org/x/mod from 0.13.0 to 0.14.0 ([#681](https://redirect.github.com/databricks/databricks-sdk-go/issues/681))
- [`7e28d29`](7e28d29f91) Bump google.golang.org/api from 0.146.0 to 0.150.0 ([#683](https://redirect.github.com/databricks/databricks-sdk-go/issues/683))
- [`6e2bb1d`](6e2bb1def4) Add more detailed error message on default credentials not found error ([#679](https://redirect.github.com/databricks/databricks-sdk-go/issues/679))
- [`33c1c87`](33c1c8718f) Skip GCP Integration Tests using Statement Execution API ([#678](https://redirect.github.com/databricks/databricks-sdk-go/issues/678))
- Additional commits viewable in [compare view](https://github.com/databricks/databricks-sdk-go/compare/v0.24.0...v0.25.0)

</details>
---------

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Andrew Nester <andrew.nester@databricks.com>
2023-11-13 15:38:35 +00:00
Ilia Babanov e82a49b4e9
Make `databricks configure` save only explicit fields (#973)
## Changes
Save only explicit fields to the config file
This applies to two commands: `configure` and `auth login`. 
The latter only pulls env vars in the case of the `--configure-cluster`
flag

## Tests
Manual, plus additional unit test for the `configure` command
2023-11-10 14:03:57 +00:00
Serge Smertin 3284a8c56c
Improved usability of `databricks auth login ... --configure-cluster` flow by displaying cluster type and runtime version (#956)
This PR adds selectors for Databricks-connect compatible clusters and
SQL warehouses

Tested in https://github.com/databricks/cli/pull/914
2023-11-09 16:38:45 +00:00
Serge Smertin d4c0027556
Add `--debug` as shortcut for `--log-level debug` (#964)
## Changes
This PR exposes simpler interfaces to end users.

## Tests
<img width="724" alt="image"
src="https://github.com/databricks/cli/assets/259697/8bd25110-33f0-4197-8f00-2b8198c4aba6">
2023-11-09 10:40:47 +00:00
Andrew Nester 47e434db2f
Improve error message when `--json` flag is specified (#933)
## Changes
Improve error message when --json input is provided

## Tests
```
cli % databricks model-registry create-model mymodel --json @./input.json              
Error: when --json flag is specified, no positional arguments are required. Provide NAME in your JSON input
```
2023-11-08 15:07:29 +00:00
Serge Smertin e68a88e14d
Added `env.UserHomeDir(ctx)` for parallel-friendly tests (#955)
## Changes
`os.Getenv(..)` is not friendly with `libs/env`. This PR makes the
relevant changes to the places where we need to read the user home
directory.

## Tests
Mainly done in https://github.com/databricks/cli/pull/914
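A sketch of the idea behind `env.UserHomeDir(ctx)` — this is not the actual `libs/env` implementation, just the shape of a context-scoped override that lets tests avoid mutating the real process environment:

```go
package env

import (
	"context"
	"os"
)

type homeDirKey struct{}

// WithUserHomeDir returns a context that overrides the user home directory.
// Tests can use this instead of mutating HOME via t.Setenv, which would
// prevent them from running in parallel.
func WithUserHomeDir(ctx context.Context, value string) context.Context {
	return context.WithValue(ctx, homeDirKey{}, value)
}

// UserHomeDir returns the home directory from the context override if set,
// falling back to the real home directory otherwise.
func UserHomeDir(ctx context.Context) (string, error) {
	if v, ok := ctx.Value(homeDirKey{}).(string); ok {
		return v, nil
	}
	return os.UserHomeDir()
}
```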
2023-11-08 14:50:20 +00:00
Ilia Babanov 65dd9c5c0f
Log process ID in each log entry (#949)
## Changes
This will help differentiate multiple cli commands that write to the
same log file.
Noticed that the root module wasn't using the common log utilities;
refactored it to avoid missing log arguments.
Relevant PR on the databricks vscode extension side:
https://github.com/databricks/databricks-vscode/pull/923

## Tests
Tested manually for sdk and cli loggers
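A standalone sketch of the idea using `log/slog` — the CLI's logging actually lives in `libs/log`; only the attach-a-pid pattern is shown here:

```go
package main

import (
	"log/slog"
	"os"
)

func main() {
	// Attach the process ID to every record so that concurrent CLI
	// invocations writing to the same log file can be told apart.
	logger := slog.New(slog.NewTextHandler(os.Stderr, nil)).
		With(slog.Int("pid", os.Getpid()))

	logger.Info("process start")
}
```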
2023-11-08 08:51:01 +00:00
Serge Smertin 7509e4d55a
Hide `--progress-format` global flag (#965)
## Changes
At the moment, these flags are mostly used for VSCode integration for
bundles, but they're not effective for the majority of commands.

## Tests
<img width="559" alt="image"
src="https://github.com/databricks/cli/assets/259697/15ddd322-f746-48ad-b1ce-35c55e664a06">
2023-11-08 08:29:22 +00:00
Andrew Nester f07832746b
Make configure command visible + fix bundle command description (#961)
## Changes
Fixes #936 #937
2023-11-08 07:33:40 +00:00
shreyas-goenka b6aa4631f1
Fix metadata computation for empty bundle (#939)
## Changes
This PR fixes metadata computation for an empty bundle. Previously we
would error because the `terraform.Load()` mutator errors on an empty
or missing state file.

## Tests
Failing integration tests now pass.
2023-11-02 11:00:30 +00:00
Taiga Matsumoto e408b701ac
Add override to support YAML inputs for apps (#921)
## Changes

Take @andrefurlan-db 's original
[commit](https://github.com/databricks/cli/compare/databricks:6e21ced...andrefurlan-db:12ed10c)
to add `apps` support to the CLI and add the yaml file-support as an
override (the apps routes are already part of the Go SDK and are
available for use in the CLI)

**NOTE: this feature is still private preview. CLI usage will be
internal only**

## Tests
2023-10-27 18:57:26 +00:00
Miles Yucht 9b16e9bd45
Bump the Go SDK in the CLI (#919)
## Changes
Bump the Databricks Go SDK version from v0.23.0 to v0.24.0.

## Tests
2023-10-26 11:41:28 +00:00
Andrew Nester d768994bbf
Simplified code generation logic for handling path and request body parameters and JSON input (#905)
## Changes
Simplified code generation logic for handling path and request body
parameters and JSON input

Note: relies on these PRs: 
https://github.com/databricks/databricks-sdk-go/pull/666
https://github.com/databricks/databricks-sdk-go/pull/669
https://github.com/databricks/databricks-sdk-go/pull/670
2023-10-24 17:37:08 +00:00
shreyas-goenka 9f2d2b964f
Fix URL for bundle template documentation (#903)
## Changes
The doc URL link went stale. This PR updates the URL to the correct one.

Fixes https://github.com/databricks/cli/issues/899
2023-10-23 12:31:31 +00:00
Pieter Noordhuis d4be40520c
Resolve configuration before performing verification (#890)
## Changes

If a bundle configuration specifies a workspace host, and the user
specifies a profile to use, we perform a check to confirm that the
workspace host in the bundle configuration and the workspace host from
the profile are identical. If they are not, we return an error. The
check was introduced in #571.

Previously, the code included an assumption that the client
configuration was already loaded from the environment prior to
performing the check. This was not the case, and as such if the user
intended to use a non-default path to `.databrickscfg`, this path was
not used when performing the check.

The fix does the following:
* Resolve the configuration prior to performing the check.
* Don't treat the configuration file not existing as an error.
* Add unit tests.

Fixes #884.

## Tests

Unit tests and manual confirmation.
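A sketch of the resolve-then-verify shape described above, assuming the Go SDK's `config.EnsureResolved` (the actual mutator lives in the bundle code; the function name here is illustrative):

```go
package main

import (
	"fmt"

	"github.com/databricks/databricks-sdk-go/config"
)

// validateHost resolves the client configuration first, so a non-default
// .databrickscfg path (e.g. via DATABRICKS_CONFIG_FILE) is honored, and only
// then compares the profile's host against the bundle's workspace host.
func validateHost(bundleHost, profile string) error {
	cfg := &config.Config{Profile: profile}
	if err := cfg.EnsureResolved(); err != nil {
		return err
	}
	if cfg.Host != bundleHost {
		return fmt.Errorf("config host mismatch: profile uses %s, bundle uses %s", cfg.Host, bundleHost)
	}
	return nil
}

func main() {
	fmt.Println(validateHost("https://my-workspace.cloud.databricks.com", "DEFAULT"))
}
```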
2023-10-20 13:10:31 +00:00
Andrew Nester 4ad68eb314
Fixed requiring positional arguments for API URL parameters (#878)
## Changes
Some commands, such as update commands, have an argument in their URL;
for example, pipelines have `PUT pipelines/<id>` to update the pipeline.

Such parameters must remain required and be respected even if the
`--json` flag with the payload is passed.

Note: this depends on these PRs in Go SDK:
 https://github.com/databricks/databricks-sdk-go/pull/660
https://github.com/databricks/databricks-sdk-go/pull/661

## Tests
Manually running `databricks pipelines update`
2023-10-19 14:19:17 +00:00
Pieter Noordhuis 7139487c2f
Never load authentication configuration from bundle for sync command (#889)
## Changes

This is used for the sync command, where we need to ensure that a bundle
configuration never taints the authentication setup as prepared in the
environment (by our VS Code extension). Once the VS Code extension fully
builds on bundles, we can remove this check again.

## Tests

Manually confirmed that calling `databricks sync` from a bundle
directory no longer picks up its authentication configuration.
2023-10-19 12:50:46 +00:00
Pieter Noordhuis 5a53b118a7
Skip prompt on completion hook (#888)
## Changes

The first stab at this was added in #837 but only included the
`NoPrompt` check in `MustAccountClient`. I renamed it to `SkipPrompt`
(in preparation for another option that skips bundle load) and made it
work for `MustWorkspaceClient` as well.

## Tests

Manually confirmed that the completion hook no longer prompts for a
profile (when called directly with `databricks __complete`).
2023-10-19 12:34:20 +00:00
Lennart Kats (databricks) a2ee8bb45b
Improve the output of the `databricks bundle init` command (#795)
Improve the output of help, prompts, and so on for `databricks bundle
init` and the default template.

Among other things, this PR adds support for a new `welcome_message`
property that lets a template print a custom message on success:

```
$ databricks bundle init
Template to use [default-python]:
Unique name for this project [my_project]: lennart_project
Include a stub (sample) notebook in 'lennart_project/src': yes
Include a stub (sample) Delta Live Tables pipeline in 'lennart_project/src': yes
Include a stub (sample) Python package in 'lennart_project/src': yes

 Your new project has been created in the 'lennart_project' directory!

Please refer to the README.md of your project for further instructions on getting started.
Or read the documentation on Databricks Asset Bundles at https://docs.databricks.com/dev-tools/bundles/index.html.
```

---------

Co-authored-by: shreyas-goenka <88374338+shreyas-goenka@users.noreply.github.com>
2023-10-19 07:08:36 +00:00
Arpit Jasapara 1b992c0c1c
Rename MLOps Stack to MLOps Stacks (#881)
## Changes
Rename `mlops-stack` `bundle init` redirect to `mlops-stacks`.

## Tests
N/A
2023-10-18 08:53:01 +00:00
Pieter Noordhuis 4bf32cb666
Fix rendering of streaming response (#876)
## Changes

The update to the Go SDK v0.23.0 in #772 included a change to make the
billable usage API return its streaming response. This still did not
make the command print out the CSV returned by the API, however. To do
so, we call `cmdio.RenderReader` in case the response is a byte stream.

Note: there is an opportunity to parse the CSV and return JSON if
requested, but that is out of scope for this PR (it is a rather big
customization of the command).

Fixes #574.

## Tests

Manually confirmed that `databricks account billable-usage download` now
returns CSV.
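The dispatch boils down to a type switch on the response value; a simplified sketch of the idea (not the actual `cmdio.RenderReader` code):

```go
package main

import (
	"fmt"
	"io"
	"os"
	"strings"
)

// render copies byte streams (such as the billable usage CSV) through
// verbatim, and falls back to regular rendering for everything else.
func render(w io.Writer, v any) error {
	switch t := v.(type) {
	case io.Reader:
		_, err := io.Copy(w, t)
		return err
	default:
		_, err := fmt.Fprintf(w, "%v\n", t)
		return err
	}
}

func main() {
	csv := strings.NewReader("workspace_id,sku,usage\n123,STANDARD_ALL_PURPOSE_COMPUTE,42.0\n")
	_ = render(os.Stdout, csv)
}
```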
2023-10-17 10:07:55 +00:00
shreyas-goenka b2cb691988
Add alias for mlops-stack template URL (#869)
## Changes
Allows users to initialize mlops-stack by running `bundle init
mlops-stack`

## Tests
Manually

```
shreyas.goenka@THW32HFW6T playground % cli bundle init            
Template to use [default-python]: mlops-stack
Project Name [my-mlops-project]: ^C
shreyas.goenka@THW32HFW6T playground % cli bundle init mlops-stack
Project Name [my-mlops-project]: ^C
```
2023-10-16 08:36:01 +00:00
hectorcast-db 36f30c8b47
Update Go SDK to 0.23.0 and use custom marshaller (#772)
## Changes
Update Go SDK to 0.23.0 and use custom marshaller.
## Tests
* Run unit tests

* Run nightly

* Manual test:
```
./cli jobs create --json @myjob.json
```
with 
```
{
    "name": "my-job-marshal-test-go",
    "tasks": [{
        "task_key": "testgomarshaltask",
        "new_cluster": {
            "num_workers": 0,
            "spark_version": "10.4.x-scala2.12",
            "node_type_id": "Standard_DS3_v2"
        },
        "libraries": [
            {
                "jar": "dbfs:/max/jars/exampleJarTask.jar"
            }
        ],
        "spark_jar_task": {
            "main_class_name":  "com.databricks.quickstart.exampleTask"
        }
    }]
}
```
Main branch:
```
Error: Cluster validation error: Missing required field: settings.cluster_spec.new_cluster.size
```
This branch:
```
{
  "job_id":<jobid>
}
```

---------

Co-authored-by: Miles Yucht <miles@databricks.com>
2023-10-16 06:56:06 +00:00
Andrew Nester ff01898b61
Use already instantiated WorkspaceClient in sync command (#867)
## Changes
Since we use `root.MustWorkspaceClient` now, we should use the already
initialised WorkspaceClient instead of instantiating a new one.

Fixes #836
2023-10-13 13:04:15 +00:00
hectorcast-db 77101c9b85
Use profile information when getting a token using the CLI (#855)
## Changes
Use stored profile information when the user provides the profile flag
when using the `databricks auth token` command.

## Tests
Run the command with and without the profile flag

```
./cli auth token
Databricks Host: https://e2-dogfood.staging.cloud.databricks.com/
{
  "access_token": "****",
  "token_type": "Bearer",
  "expiry": "2023-10-10T14:24:11.85617+02:00"
}%

./cli auth token --profile DEFAULT
{
  "access_token": "*****",
  "token_type": "Bearer",
  "expiry": "2023-10-10T14:24:11.85617+02:00"
}%

./cli auth token https://e2-dogfood.staging.cloud.databricks.com/
{
  "access_token": "*****",
  "token_type": "Bearer",
  "expiry": "2023-10-11T09:24:55.046029+02:00"
}%   

./cli auth token --profile DEFAULT https://e2-dogfood.staging.cloud.databricks.com/
Error: providing both a profile and a host parameters is not supported
```
2023-10-11 11:12:18 +00:00
Andrew Nester ad4b476270
Ensure profile flag is respected for sync command (#837)
## Changes
Fixes #836 

## Tests
Manually running `sync` command with and without the flag

Integration tests pass as well

```
--- PASS: TestAccSyncFullFileSync (13.38s)
PASS
coverage: 39.1% of statements in ./...
ok      github.com/databricks/cli/internal      14.148s coverage: 39.1% of statements in ./...


--- PASS: TestAccSyncIncrementalFileSync (11.38s)
PASS
coverage: 39.1% of statements in ./...
ok      github.com/databricks/cli/internal      11.674s coverage: 39.1% of statements in ./...
```
2023-10-09 10:37:18 +00:00
shreyas-goenka 1e9dbcfa2a
Add `--file` flag to workspace export command (#794)
This PR:
1. Adds the `--file` flag to the workspace export command. This allows
you to specify an output file to write to (see the sketch below).
2. Adds e2e integration tests for the workspace export command
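A minimal sketch of the flag's behaviour; the export content below is a placeholder for the real Workspace API response:

```go
package main

import (
	"os"

	"github.com/spf13/cobra"
)

func newExportCommand() *cobra.Command {
	var filePath string
	cmd := &cobra.Command{
		Use:  "export SOURCE_PATH",
		Args: cobra.ExactArgs(1),
		RunE: func(cmd *cobra.Command, args []string) error {
			content := []byte("# exported notebook\n") // placeholder for the API response
			if filePath == "" {
				// Default behaviour: print to stdout.
				_, err := cmd.OutOrStdout().Write(content)
				return err
			}
			// With --file, write the exported content to the given path.
			return os.WriteFile(filePath, content, 0o644)
		},
	}
	cmd.Flags().StringVar(&filePath, "file", "", "write exported content to this file")
	return cmd
}

func main() {
	_ = newExportCommand().Execute()
}
```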
2023-10-05 13:20:33 +00:00
shreyas-goenka caade735e3
Improve `workspace import` command by allowing references to local files for content (#793)
## Changes
This PR makes a few really important QOL improvements to the `workspace
import` command.

They are:
1. Adds the `--file` flag, which allows a user to specify a file to read
the content from.
2. Wraps the most common error first time users of this command will run
into with a helpful hint.
3. Minor changes to the command Use string changing `PATH` ->
`TARGET_PATH`


## Tests
Integration tests. The newly added integration tests check that the
`--file` flag works as expected for both the `SOURCE` and `AUTO` formats.

Skipped the other formats because the API behaviour for them is
straightforward.
2023-10-05 12:48:59 +00:00
Andrew Nester e1d1e95525
Updated Go SDK to 0.22.0 (#831)
## Changes
Updated Go SDK to 0.22.0
2023-10-03 11:46:16 +00:00
Andrew Nester aa9c2a1eab
Prompt for profile only in interactive mode (#788)
## Changes
Do not prompt for profiles if not in interactive mode

## Tests
Running sample Go code

```
cmd := exec.Command("databricks", "auth", "login", "--host", "***")
out, err := cmd.CombinedOutput()
```
Before the change
```
Error: ^D

exit status 1
```

After
```
No error (empty output)
```
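A sketch of the guard, assuming `golang.org/x/term` for the TTY check (the CLI's actual prompt plumbing lives in `libs/cmdio`):

```go
package main

import (
	"fmt"
	"os"

	"golang.org/x/term"
)

// promptForProfile asks the user to pick a profile, but only when stdin is a
// TTY; when driven by another program (as in the exec.Command case above),
// it returns an error instead of hanging while waiting for input.
func promptForProfile() (string, error) {
	if !term.IsTerminal(int(os.Stdin.Fd())) {
		return "", fmt.Errorf("no profile specified and not running interactively")
	}
	var profile string
	fmt.Print("Profile: ")
	_, err := fmt.Scanln(&profile)
	return profile, err
}

func main() {
	p, err := promptForProfile()
	fmt.Println(p, err)
}
```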
2023-09-21 12:38:45 +00:00
shreyas-goenka 21ff71ceea
Add documentation link bundle command group description (#770)
Help output:
```
shreyas.goenka@THW32HFW6T ~ % cli bundle -h
Databricks Asset Bundles.
Documentation URL: https://docs.databricks.com/en/dev-tools/bundles.

Usage:
  databricks bundle [command]
```

---------

Co-authored-by: Pieter Noordhuis <pieter.noordhuis@databricks.com>
2023-09-12 13:38:43 +00:00
Pieter Noordhuis 3cb74e72a8
Run environment related tests in a pristine environment (#769)
## Changes

If the caller running the test has one or more environment variables
that are used in the test already set, they can interfere and make tests
fail.

## Tests

Ran tests in `./cmd/root` with Databricks related environment variables
set.
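A sketch of one way to scrub ambient variables in a test — an illustrative helper, not the repository's actual test utility:

```go
package root

import (
	"os"
	"strings"
	"testing"
)

// pristineEnv unsets all DATABRICKS_* variables for the duration of a test,
// so credentials set in the caller's shell cannot leak into assertions.
func pristineEnv(t *testing.T) {
	for _, kv := range os.Environ() {
		key, value, _ := strings.Cut(kv, "=")
		if strings.HasPrefix(key, "DATABRICKS_") {
			os.Unsetenv(key)
			t.Cleanup(func() { os.Setenv(key, value) })
		}
	}
}

func TestWithoutAmbientEnv(t *testing.T) {
	pristineEnv(t)
	// ... exercise code that reads DATABRICKS_HOST, DATABRICKS_TOKEN, etc.
}
```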
2023-09-12 13:28:53 +00:00
Pieter Noordhuis a2775f836f
Use interactive prompt to select resource to run if not specified (#762)
## Changes

Display an interactive prompt with a list of resources to run if one
isn't specified and the command is run interactively.

## Tests

Manually confirmed:
* The new prompt works
* Shell completion still works
* Specifying a key argument still works
2023-09-11 18:03:12 +00:00
Pieter Noordhuis 0cb05d1ded
Prompt once for a client profile (#727)
## Changes

The previous implementation ran the risk of infinite looping for the
account client due to a mismatch in determining what constitutes an
account client between the CLI and SDK (see
[here](83443bae8d/libs/databrickscfg/profiles.go (L61))
and
[here](0fdc5165e5/config/config.go (L160))).

Ultimately, this code must never loop infinitely. If a user is prompted
and selects a profile that cannot be used, they should receive that
feedback immediately and try again, instead of being prompted again.

Related to #726.

## Tests
2023-09-11 15:32:24 +00:00
shreyas-goenka ad84abf415
Fix temporary directory cleanup for init repository downloading (#760)
## Changes
This PR fixes a bug where the temp directory created to download the
template would not be cleaned up.

## Tests
Tested manually. The exact process is described in a comment below.
2023-09-11 10:22:05 +00:00
Pieter Noordhuis 4ccc70aeac
Consolidate environment variable interaction (#747)
## Changes

There are a couple places throughout the code base where interaction
with environment variables takes place. Moreover, more than one of these
would try to read a value from more than one environment variable as
fallback (for backwards compatibility). This change consolidates those
accesses.

The majority of diffs in this change are mechanical (i.e. add an
argument or replace a call).

This change:
* Moves common environment variable lookups for bundles to
`bundles/env`.
* Adds a `libs/env` package that wraps `os.LookupEnv` and `os.Getenv`
and allows for overrides to take place in a `context.Context`. By
scoping overrides to a `context.Context` we can avoid `t.Setenv` in
testing and unlock parallel test execution for integration tests.
* Updates call sites to pass through a `context.Context` where needed.
* For bundles, introduces `DATABRICKS_BUNDLE_ROOT` as new primary
variable instead of `BUNDLE_ROOT`. This was the last environment
variable that did not use the `DATABRICKS_` prefix.
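
A sketch of the context-scoped override mechanism described above (the real `libs/env` package differs in detail):

```go
package env

import (
	"context"
	"os"
)

type overridesKey struct{}

// Set returns a context in which key appears to have the given value,
// without touching the real process environment. Because the override is
// scoped to the context, tests can run in parallel without t.Setenv.
func Set(ctx context.Context, key, value string) context.Context {
	m, _ := ctx.Value(overridesKey{}).(map[string]string)
	next := make(map[string]string, len(m)+1)
	for k, v := range m {
		next[k] = v
	}
	next[key] = value
	return context.WithValue(ctx, overridesKey{}, next)
}

// Lookup checks context overrides first and falls back to os.LookupEnv.
func Lookup(ctx context.Context, key string) (string, bool) {
	if m, ok := ctx.Value(overridesKey{}).(map[string]string); ok {
		if v, ok := m[key]; ok {
			return v, true
		}
	}
	return os.LookupEnv(key)
}
```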

## Tests

Unit tests pass.
2023-09-11 08:18:43 +00:00
Andrew Nester a41b9e8bf2
Added description for version command (#737)
## Changes
Added description for version command

## Tests
```
databricks help

...

Additional Commands:
  account              Databricks Account Commands
  api                  Perform Databricks API call
  auth                 Authentication related commands
  bundle               Databricks Asset Bundles
  completion           Generate the autocompletion script for the specified shell
  fs                   Filesystem related commands
  help                 Help about any command
  sync                 Synchronize a local directory to a workspace directory
  version              Retrieve information about the current version of CLI
```

---------

Co-authored-by: Pieter Noordhuis <pieter.noordhuis@databricks.com>
2023-09-06 08:41:47 +00:00
Lennart Kats (databricks) e533f9109a
Show 'databricks bundle init' template in CLI prompt (#725)
~(this should be changed to target `main`)~

This reveals the template from
https://github.com/databricks/cli/pull/686 in CLI prompts once #686
and #708 are merged.

---------

Co-authored-by: Andrew Nester <andrew.nester@databricks.com>
Co-authored-by: PaulCornellDB <paul.cornell@databricks.com>
Co-authored-by: Pieter Noordhuis <pieter.noordhuis@databricks.com>
2023-09-05 13:57:01 +00:00
Lennart Kats (databricks) 8c2cc07f7b
databricks bundle init template v1 (#686)
## Changes

This adds a built-in "default-python" template to the CLI. This is based
on the new default-template support of
https://github.com/databricks/cli/pull/685.

The goal here is to offer an experience where customers can simply type
`databricks bundle init` to get a default template:

```
$ databricks bundle init
Template to use [default-python]: default-python
Unique name for this project [my_project]: my_project
 Successfully initialized template
```

The present template:
- [x] Works well with VS Code
- [x] Works well with the workspace
- [x] Works well with DB Connect
- [x] Uses minimal stubs rather than boiler-plate-heavy examples

I'll have a followup with tests + DLT support.

---------

Co-authored-by: Andrew Nester <andrew.nester@databricks.com>
Co-authored-by: PaulCornellDB <paul.cornell@databricks.com>
Co-authored-by: Pieter Noordhuis <pieter.noordhuis@databricks.com>
2023-09-05 11:58:34 +00:00
Pieter Noordhuis f62def3e77
Replace API call to test configuration with dummy authenticate call (#728)
## Changes

This reduces the latency of every workspace command by the duration of a
single API call to retrieve the current user (which can take up to a
full second).

Note: the better place to verify that a request can be authenticated is
the SDK itself.

## Tests

* Unit test to confirm an empty `*http.Request` can be constructed
* Manually confirmed that the additional API call no longer happens
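A sketch of the idea, assuming the SDK's `(*config.Config).Authenticate`; the host and token values are hypothetical, and the CLI's actual hook differs:

```go
package main

import (
	"fmt"
	"net/http"

	"github.com/databricks/databricks-sdk-go/config"
)

// verifyAuth asks the credential provider to decorate an empty request with
// auth headers. This exercises the auth chain without spending up to a full
// second on a "who am I" API round trip.
func verifyAuth(cfg *config.Config) error {
	req := &http.Request{Header: make(http.Header)}
	return cfg.Authenticate(req)
}

func main() {
	cfg := &config.Config{Host: "https://example.cloud.databricks.com", Token: "dapi-example"}
	fmt.Println(verifyAuth(cfg))
}
```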
2023-09-05 11:10:37 +00:00
Pieter Noordhuis 7a130a3e6e
Group permission related commands (#730)
## Changes

Before:
```
Usage:
  databricks instance-pools [command]

Available Commands:
  create                Create a new instance pool.
  delete                Delete an instance pool.
  edit                  Edit an existing instance pool.
  get                   Get instance pool information.
  get-permission-levels Get instance pool permission levels.
  get-permissions       Get instance pool permissions.
  list                  List instance pool info.
  set-permissions       Set instance pool permissions.
  update-permissions    Update instance pool permissions.
```

After:
```
Usage:
  databricks instance-pools [command]

Available Commands
  create                Create a new instance pool.
  delete                Delete an instance pool.
  edit                  Edit an existing instance pool.
  get                   Get instance pool information.
  list                  List instance pool info.

Permission Commands
  get-permission-levels Get instance pool permission levels.
  get-permissions       Get instance pool permissions.
  set-permissions       Set instance pool permissions.
  update-permissions    Update instance pool permissions.
```

## Tests

Manual.
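One way to get this kind of sectioned help output is cobra's command groups (available since cobra v1.6); a minimal sketch, not necessarily the exact mechanism used here:

```go
package main

import "github.com/spf13/cobra"

func main() {
	root := &cobra.Command{Use: "instance-pools"}

	// Declare a named group; subcommands opt in via GroupID and are then
	// listed under "Permission Commands" in the help output.
	root.AddGroup(&cobra.Group{ID: "permissions", Title: "Permission Commands"})

	root.AddCommand(
		&cobra.Command{Use: "create", Short: "Create a new instance pool."},
		&cobra.Command{Use: "get-permissions", Short: "Get instance pool permissions.", GroupID: "permissions"},
		&cobra.Command{Use: "set-permissions", Short: "Set instance pool permissions.", GroupID: "permissions"},
	)

	_ = root.Execute()
}
```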
2023-09-05 09:58:45 +00:00
Pieter Noordhuis 1752e29885
Update Go SDK to v0.19.0 (#729)
## Changes

* Update Go SDK to v0.19.0
* Update commands per OpenAPI spec from Go SDK
* Incorporate `client.Do()` signature change to include a (nil) header
map
* Update `workspace.WorkspaceService` mock with permissions methods
* Skip `files` service in codegen; already implemented under the `fs`
command

## Tests

Unit and integration tests pass.
2023-09-05 09:43:57 +00:00
Lennart Kats (databricks) 707fd6f617
Cleanup after "Add a foundation for built-in templates" (#707)
## Changes
Add some cleanup based on @pietern's comments on
https://github.com/databricks/cli/pull/685
2023-08-30 14:01:08 +00:00