Commit Graph

208 Commits

Author SHA1 Message Date
Andrew Nester f7a76ff5d8
Fixed processing jobs libraries with remote path (#638)
## Changes
Some library paths, such as those for Spark jobs, can reference a library
on a remote path, for example DBFS.
This PR fixes how the CLI handles such libraries so that they are no
longer reported as missing locally.
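
For illustration, a minimal sketch of a job task whose library points at a remote DBFS path rather than a local file (the job, task, and wheel names here are hypothetical):

```yaml
resources:
  jobs:
    my_job:
      tasks:
        - task_key: main
          libraries:
            # Remote path: the wheel lives on DBFS, so the CLI must not
            # report it as missing from the local filesystem.
            - whl: dbfs:/mnt/libraries/my_lib-0.1.0-py3-none-any.whl
```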

## Tests
Added unit tests + ran `databricks bundle deploy` manually
2023-08-07 09:55:30 +00:00
shreyas-goenka ce9c9148c9
Regenerate bundle resource structs from latest terraform provider (#633)
## Changes
This PR:
1. Regenerates the terraform provider structs based off the latest
terraform provider version: 1.22.0
2. Adds a debug launch configuration for regenerating the schema

## Tests
Existing unit tests
2023-08-03 11:20:30 +00:00
Kartik Gupta 3140a8feef
Initialise an empty default bundle if BUNDLE_ROOT and DATABRICKS_BUNDLE_INCLUDES env vars are present (#604)
2023-08-02 17:22:47 +00:00
Kartik Gupta 31b178ad6c
Add DATABRICKS_BUNDLE_INCLUDE_PATHS to specify include paths through env vars (#591)
## Changes
* This PR adds the `DATABRICKS_BUNDLE_INCLUDE_PATHS` environment variable,
so that we can specify additional bundle config files to include without
committing them. These could be local dev overrides or overrides by our
tools, like the VS Code extension.
* We always add these include paths to the "include" field.
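
As an illustration (the file name is hypothetical): with `DATABRICKS_BUNDLE_INCLUDE_PATHS` pointing at an extra config file, the CLI behaves as if that file were listed in the `include` field:

```yaml
# Equivalent of setting DATABRICKS_BUNDLE_INCLUDE_PATHS=dev-overrides.yml
# in the environment; the file is merged in without being committed.
include:
  - dev-overrides.yml
```
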
## Tests
* [x] Unit tests
2023-08-02 10:18:19 +00:00
shreyas-goenka 5df8935de4
Add JSON schema validation for input template parameters (#598)
## Changes

This PR:
1. Adds code for reading template configs and validating them against a
JSON schema.
2. Moves the json schema struct in `bundle/schema` to a separate library
package. This struct is now reused for validating template configs.

## Tests
Unit tests
2023-08-01 14:09:27 +00:00
Lennart Kats (databricks) 433f401c83
Add validation for Git settings in bundles (#578)
## Changes

This checks whether the Git settings are consistent with the actual Git
state of a source directory.

(This PR adds to https://github.com/databricks/cli/pull/577.) 

Previously, we would silently let users configure their Git branch to
e.g. `main` and deploy with that metadata even if they were actually on
a different branch.

With these changes, the following config would result in an error when
deployed from any other branch than `main`:

```
bundle:
  name: example

workspace:
  git:
    branch: main

environments:
  ...
```

> not on the right Git branch:
>   expected according to configuration: main
>   actual: my-feature-branch

It's not very useful to set the same branch for all environments,
though. For development, it's better to just let the CLI auto-detect the
right branch. Therefore, it's now possible to set the branch just for a
single environment:

```
bundle:
  name: example 2

environments:
  development:
    default: true

  production:
    # production can only be deployed from the 'main' branch
    git:
      branch: main
```

Adding to that, the `mode: production` option actually checks that users
explicitly set the Git branch as seen above. Setting that branch helps
avoid mistakes, where someone accidentally deploys to production from
the wrong branch. (I could see us offering an escape hatch for that in
the future.)

## Tests

Manual testing to validate the experience and error messages. Automated
unit tests.

---------

Co-authored-by: Fabian Jakobs <fabian.jakobs@databricks.com>
2023-07-30 12:44:33 +00:00
Lennart Kats (databricks) d55652be07
Extend deployment mode support (#577)
## Changes

This adds `mode: production` option. This mode doesn't do any
transformations but verifies that an environment is configured correctly
for production:

```
environments:
  prod:
    mode: production

    # paths should not be scoped to a user (unless a service principal is used)
    root_path: /Shared/non_user_path/...

    # run_as and permissions should be set at the resource level (or at the top level when that is implemented)
    run_as:
      user_name: Alice
    permissions:
    - level: CAN_MANAGE
      user_name: Alice
```

Additionally, this extends the existing `mode: development` option by
* prefixing deployed assets with `[dev your.user]` instead of just
`[dev]`
* validating that development deployments _are_ scoped to a user

## Related

https://github.com/databricks/cli/pull/578/files (in draft)

## Tests

Manual testing to validate the experience, error messages, and
functionality with all resource types. Automated unit tests.

---------

Co-authored-by: Fabian Jakobs <fabian.jakobs@databricks.com>
2023-07-30 07:19:49 +00:00
Andrew Nester 12bba17743
Added support for build command chaining and error on missing wheel (#607)
## Changes
Added support for build command chaining and error on missing wheel
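
A minimal sketch of what a chained build command might look like; the artifact name and the exact commands are assumptions, not taken from this commit:

```yaml
artifacts:
  my_wheel:
    type: whl
    # Commands chained with && now run as one build step; if no wheel
    # exists after the build, deployment fails with an error.
    build: python setup.py clean && python setup.py bdist_wheel
```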
2023-07-26 12:58:52 +00:00
Andrew Nester cfff140815
Auto detect Python wheel packages and infer build command (#603) 2023-07-26 10:07:26 +00:00
Andrew Nester 5e0a096722
Fixed python wheel test (#608)
## Changes
Fixed python wheel test

2023-07-26 11:02:17 +02:00
Andrew Nester 9a88fa602d
Added support for artifacts building for bundles (#583)
## Changes
Added support for building artifacts for bundles.

It is now possible to specify an `artifacts` block in bundle.yml and define
an artifact (at the moment a Python wheel) to be built and uploaded during
`bundle deploy`.

The built artifact will be automatically attached to the corresponding job
task or pipeline where it is used as a library; see the sketch below.
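
A sketch of that flow, with an artifact defined in the config and consumed as a library by a job task (all names and paths here are hypothetical):

```yaml
artifacts:
  my_wheel:
    type: whl
    path: ./my_wheel   # directory containing the wheel's setup.py

resources:
  jobs:
    my_job:
      tasks:
        - task_key: main
          python_wheel_task:
            package_name: my_wheel
            entry_point: run
          libraries:
            # Built during `bundle deploy` and attached automatically.
            - whl: ./my_wheel/dist/*.whl
```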

Follow-ups:
1. If an artifact is used in a job or pipeline but not found in the config,
try to infer and build it anyway
2. If no build command is provided for a Python wheel artifact, infer it
2023-07-25 13:35:08 +02:00
shreyas-goenka fa37449f1f
Require include glob patterns to be explicitly defined (#602)
## Changes
Before this PR we would load all YAML files matching `*.yml` and
`*/*.yml` as bundle configurations. This was problematic since it would
also load YAML files that were not meant to be part of the bundle.
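
With this change, the patterns have to be spelled out in the config; a minimal sketch:

```yaml
include:
  # Only files matching these patterns are loaded as bundle config;
  # nothing is picked up implicitly anymore.
  - "*.yml"
  - "resources/*.yml"
```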

## Tests
Manually; files are now no longer included unless explicitly specified
2023-07-25 10:00:46 +02:00
Fabian Jakobs adab9aa5d7
Add support for more SDK config options (#587)
## Changes
Add support for more SDK config options
2023-07-19 14:06:58 +02:00
dependabot[bot] 65d8fe13e9
Bump github.com/databricks/databricks-sdk-go from 0.12.0 to 0.13.0 (#585)
Bumps
[github.com/databricks/databricks-sdk-go](https://github.com/databricks/databricks-sdk-go)
from 0.12.0 to 0.13.0.
Release notes (sourced from github.com/databricks/databricks-sdk-go's
releases, v0.13.0):

* Add issue templates (#539).
* Added HasRequiredNonBodyField method (#536).
* Make Azure MSI auth account compatible (#544).
* Refactor Handling of Name<->ID Mapping in OpenAPI Generator (#547).
* Regenerate Go SDK from current OpenAPI Specification (#549).
* Parse Camel Case and Pascal Case Enum Values (#550).
* Prepare for auto-releaser infra (#554).
* Added SCIM Patch Acceptance Tests (#540).

API Changes:

* Removed `Maintenance` method for the `w.Metastores` workspace-level service.
* Added `EnableOptimization` method for the `w.Metastores` workspace-level service.
* Added `Update` method for the `w.Tables` workspace-level service.
* Added `Force` field for `catalog.DeleteAccountMetastoreRequest` and `catalog.DeleteAccountStorageCredentialRequest`.
* Removed `catalog.UpdateAutoMaintenance` and `catalog.UpdateAutoMaintenanceResponse`.
* Added `catalog.UpdatePredictiveOptimization`, `catalog.UpdatePredictiveOptimizationResponse`, and `catalog.UpdateTableRequest`.
* Added `Schema` field for `iam.PartialUpdate`; added `iam.PatchSchema`.
* Added `TriggerInfo` field for `jobs.BaseRun` and `jobs.Run`.
* Added `Health` field for `jobs.CreateJob`, `jobs.JobSettings`, `jobs.SubmitRun`, `jobs.SubmitTask`, and `jobs.Task`.
* Added `JobSource` field for `jobs.GitSource`.
* Added `OnDurationWarningThresholdExceeded` field for `jobs.JobEmailNotifications`, `jobs.TaskEmailNotifications`, and `jobs.WebhookNotifications`.
* Added `RunJobOutput` field for `jobs.RunOutput`.
* Added `RunJobTask` field for `jobs.RunTask` and `jobs.Task`.
* Added `EmailNotifications` field for `jobs.SubmitRun` and `jobs.SubmitTask`.
* Added `NotificationSettings` field for `jobs.SubmitTask`.
* Added `jobs.JobSource`, `jobs.JobSourceDirtyState`, `jobs.JobsHealthMetric`, `jobs.JobsHealthOperator`, `jobs.JobsHealthRule`, `jobs.JobsHealthRules`, `jobs.RunJobOutput`, and `jobs.RunJobTask`.

(... truncated)
Commits:

* b4fb746 Release v0.13.0 (#555)
* 180c7ee Added SCIM Patch Acceptance Tests (#540)
* 546814a Prepare for auto-releaser infra (#554)
* 7e680c5 Parse Camel Case and Pascal Case Enum Values (#550)
* e71ece4 Bump google.golang.org/api from 0.130.0 to 0.131.0 (#551)
* 3b4492b Regenerate Go SDK from current OpenAPI Specification (#549)
* f84de61 Refactor Handling of Name<->ID Mapping in OpenAPI Generator (#547)
* c37a894 Bump google.golang.org/api from 0.129.0 to 0.130.0 (#542)
* 4f2aa38 Bump golang.org/x/mod from 0.11.0 to 0.12.0 (#541)
* e80f6e1 Make Azure MSI auth account compatible (#544)
* Additional commits viewable in the compare view:
  https://github.com/databricks/databricks-sdk-go/compare/v0.12.0...v0.13.0


---------

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Serge Smertin <serge.smertin@databricks.com>
2023-07-18 15:30:00 +00:00
Fabian Jakobs 8cfb1c133e
First look for databricks.yml before falling back to bundle.yml (#580)
## Changes
* Add support for using `databricks.yml` as the config file. If
`databricks.yml` is not found, fall back to `bundle.yml` for
backwards compatibility.
* Add support for the `.yaml` extension.
* Raise an error when more than one config file is found.

## Tests
* added unit test
* manual testing the different cases

---------

Co-authored-by: Pieter Noordhuis <pieter.noordhuis@databricks.com>
2023-07-18 12:16:34 +02:00
Shreyas Goenka 7fe4113f5b
use logPlan for destroy 2023-07-12 18:24:05 +02:00
Shreyas Goenka 57421f185c
remove confirmation check 2023-07-12 18:02:41 +02:00
Shreyas Goenka 732b4f4b2c
Merge remote-tracking branch 'origin' into plan-deploy-2 2023-07-12 17:55:03 +02:00
shreyas-goenka f00488d81d
Disallow notebooks in paths where files are expected (#573)
## Changes
Uploading a notebook strips its file extension. This PR returns an
error if a notebook is specified where a file is expected, for example:
a Spark Python task in a job, or a `libraries.file.path` DLT library
(where `libraries.notebook.path` should be used instead).

This PR also adds test coverage for the opposite case: when files that
are not notebooks are used where notebooks are expected.
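
For instance, in a DLT pipeline a notebook must be referenced through the notebook library type; a sketch with hypothetical names:

```yaml
resources:
  pipelines:
    my_pipeline:
      libraries:
        # Correct: a notebook goes under `notebook.path`.
        - notebook:
            path: ./my_notebook.py
        # Wrong, and now an error: referencing the same notebook
        # via `file.path`, as if it were a plain file.
        # - file:
        #     path: ./my_notebook.py
```
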

## Tests
Integration tests and manually
2023-07-12 12:25:00 +00:00
Andrew Nester 650fb0e8b6
Correctly use --profile flag passed for all bundle commands (#571)
## Changes
Correctly use --profile flag passed for all bundle commands.

Also adds validation: if the host configured in the bundle mismatches the
provided profile, an error is thrown.

Co-authored-by: Pieter Noordhuis <pieter.noordhuis@databricks.com>
2023-07-12 14:09:25 +02:00
Lennart Kats (databricks) 57e75d3e22
Add development runs (#522)
This implements the "development run" functionality that we desire for DABs in the workspace / IDE.

## bundle.yml changes

In bundle.yml, there should be a "dev" environment that is marked as
`mode: development`:
```
environments:
  dev:
    default: true
    mode: development # future accepted values might include pull_request, production
```

Setting `mode` to `development` indicates that this environment is used
just for running things for development. This results in several changes
to deployed assets:
* All assets will get '[dev]' in their name and will get a 'dev' tag
* All assets will be hidden from the list of assets (future work; e.g.
for jobs we would have a special job_type that hides it from the list)
* All deployed assets will be ephemeral (future work, we need some form
of garbage collection)
* Pipelines will be marked as 'development: true'
* Jobs can run on development compute through the `--compute` parameter
in the CLI
* Jobs get their schedule / triggers paused
* Jobs get concurrent runs (it's really annoying if your runs get
skipped because the last run was still in progress)

Other accepted values for `mode` are `default` (which does nothing) and
`pull-request` (which is reserved for future use).

## CLI changes

To run a single job called "shark_sightings" on existing compute, use the
following commands:
```
$ databricks bundle deploy --compute 0617-201942-9yd9g8ix
$ databricks bundle run shark_sightings
```

which would deploy and run a job called "[dev] shark_sightings" on the
compute provided. Note that `--compute` is not accepted in production
environments, so we show an error if `mode: development` is not used.

The `run --deploy` command offers a convenient shorthand for the common
combination of deploying & running:
```
$ export DATABRICKS_COMPUTE=0617-201942-9yd9g8ix
$ bundle run --deploy shark_sightings
```
The `--deploy` addition isn't really essential and I welcome feedback 🤔
I played with the idea of a "debug" or "dev" command but that seemed to
only make the option space even broader for users. The above could work
well with an IDE or workspace that automatically sets the target
compute.

One more thing I added: `run --no-wait` can now be used to run
something without waiting for it to be completed (useful for IDE-like
environments that can display progress themselves).
```
$ bundle run --deploy shark_sightings --no-wait
```
2023-07-12 08:51:54 +02:00
shreyas-goenka 47f4d30229
Make top level workspace optional in JSON schema (#562)
## Tests
Tested manually. `"workspace"` is no longer a required field in the
generated JSON schema

Co-authored-by: Pieter Noordhuis <pieter.noordhuis@databricks.com>
2023-07-07 13:10:25 +00:00
shreyas-goenka 057f328879
Update inline JSON schema documentation (#557)
## Changes
Add docs for experiments and models to the json schema. Update the
schema to the latest openapi spec.

## Tests
Manually
2023-07-07 13:00:12 +00:00
Pieter Noordhuis b6665f4b30
Update Terraform provider schema structs (#563)
## Changes

Generated from 47857a63c7242fc43aba833cdd28b222fd25c399 (next release
after 1.20).

## Tests

The change is additive and unit tests pass.
2023-07-07 14:52:41 +02:00
Gleb Kanterov 179154477e
Propagate TF_CLI_CONFIG_FILE env variable (#555)
## Changes
Propagate `TF_CLI_CONFIG_FILE` env variable.

From Terraform documentation:

> The location of the Terraform CLI configuration file can also be
specified using the TF_CLI_CONFIG_FILE [environment
variable](https://developer.hashicorp.com/terraform/cli/config/environment-variables)

It allows using custom builds of terraform-provider-databricks, using
config files like:

```tf
provider_installation {
  dev_overrides {
    "databricks/databricks" = "/Users/gleb.kanterov/terraform-provider-databricks"
  }

  direct {}
}
```

## Tests
I added unit tests.
2023-07-07 13:20:37 +02:00
Andrew Nester b14920cd12
Fixed error reporting when included invalid files in include section (#543)
## Changes
Fixed error reporting when invalid files are included in the include
section.

Case 1. When the file to include is invalid, throw an error.
Case 2. When the file is loaded but the schema is wrong, indicate which
file failed to load.

## Tests

With non-existent notexists.yml

```
databricks bundle deploy
Error: notexists.yml defined in 'include' section does not match any files

```

With malformed notexists.yml
```
databricks bundle deploy
Error: failed to load /Users/andrew.nester/dabs/wheel/notexists.yml: error unmarshaling JSON: json: cannot unmarshal string into Go value of type config.Root
```
2023-07-07 10:22:58 +00:00
stikkireddy 533234f148
Fix: bundle destroy fails when bundle.tf.json file is deleted (#519)
## Changes

Adds the following steps to the destroy phase:
1. interpolate
2. write

Resolves #518 

## Tests

Tested manually due to there not being examples for tests to use.
2023-07-05 21:58:06 +02:00
Shreyas Goenka cd89e5b1ac
fix error not being returned 2023-07-05 16:49:54 +02:00
Shreyas Goenka e87edd7435
Display plan on bundle deploy 2023-07-05 16:40:40 +02:00
Pieter Noordhuis ad8183d7a9
Bump Go SDK to v0.12.0 (#540)
## Changes

* Regenerate CLI commands
* Ignore `account-access-control-proxy` (see #505)

## Tests

Unit and integration tests pass.
2023-07-03 11:46:45 +02:00
Pieter Noordhuis cf92698eb3
Application -> Asset (#529) 2023-06-27 01:31:20 +02:00
stikkireddy 3c1e69a064
Update variable regex to support hyphens (#503)
## Changes

Modified interpolation logic to use:
`\$\{([a-zA-Z]+([-_]*[a-zA-Z0-9]+)*(\.[a-zA-Z]+([-_]*[a-zA-Z0-9]+)*)*)\}`

**Edit**: Suggested by @pietern
`\$\{([a-zA-Z]+([-_]?[a-zA-Z0-9]+)*(\.[a-zA-Z]+([-_]?[a-zA-Z0-9]+)*)*)\}`
to be more selective and not allow consecutive hyphens or underscores,
to keep the keys more readable.

Explanation:
1. All interpolation starts with `${` and ends with `}`.
2. All interpolated locations are split by `.`.
3. All sections are expected to start with a letter `[a-zA-Z]`; no
numbers, hyphens or underscores.
4. All sections are expected to end with an alphanumeric `[a-zA-Z0-9]`;
no hyphens or underscores.

Overall, this change makes interpolation more permissive than before.

**Note:** it does break backwards compatibility, because `[a-zA-Z] !=
[\w]`; `\w` includes alphanumerics and underscores (`\w = [a-zA-Z0-9_]`),
so sections can no longer start with a digit or an underscore.
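
A small sketch of what the relaxed pattern permits, assuming the usual `${...}` reference syntax (the variable and job names are hypothetical):

```yaml
variables:
  instance-pool-id:
    default: pool-123

resources:
  jobs:
    my-job:
      # `${var.instance-pool-id}` now matches the pattern; consecutive
      # separators such as `${var.instance--pool}` still do not.
      name: "job on ${var.instance-pool-id}"
```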

## Tests
There are two tests with examples of valid and invalid interpolation and
a test to validate expansion.
2023-06-23 12:56:54 +02:00
Gleb Kanterov ccbcccd929
Add DATABRICKS_BUNDLE_TMP env variable (#462)
## Changes
Add DATABRICKS_BUNDLE_TMP env variable. It allows using a temporary
directory instead of writing to `$CWD/.databricks/bundle`

## Tests
I added unit tests

---------

Co-authored-by: Pieter Noordhuis <pieter.noordhuis@databricks.com>
2023-06-21 09:53:54 +02:00
dependabot[bot] 7fb34e4767
Bump github.com/databricks/databricks-sdk-go from 0.9.1-0.20230614092458-b5bbc1c8dabb to 0.10.0 (#497) 2023-06-21 07:43:07 +00:00
stikkireddy ddeb7487b3
Update Terraform provider schema structs (#504)
## Changes

Generated from provider version 1.19.0.

## Tests

Ran the existing unit tests and they seemed to have checked out.
2023-06-21 08:40:11 +02:00
shreyas-goenka 5d036ab6b8
Fix locker unlock for destroy (#492)
## Changes
Adds the ability for unlock to succeed even if the deploy file is
missing.

## Tests
Using integration tests and manually
2023-06-19 15:57:25 +02:00
shreyas-goenka de47cf19f1
Use better error assertions and clean up locker API (#490)
## Changes
Some cleanup work

## Tests
Locker integration test passes
2023-06-16 16:29:04 +02:00
Pieter Noordhuis 1875908b59
Pass through proxy related environment variables (#465)
## Changes

If set on the host, we must pass them through to Terraform.

## Tests

Unit tests pass.
2023-06-14 21:58:26 +02:00
Serge Smertin 2aa61a7c1b
Update with the latest Go SDK (#457)
## Changes
- removed deprecated methods
- regenerated with the latest OpenAPI spec
- picked up the latest go SDK version

## Tests
`make test`
2023-06-12 14:23:21 +02:00
Pieter Noordhuis 894d25e434
Check for nil environment before accessing it (#453) 2023-06-08 20:55:49 +00:00
stikkireddy 402fcdd62c
Skip path translation of job task for jobs with a Git source (#404)
## Changes

Path translation is now skipped for the notebook path in notebook tasks
and the Python file path in Spark Python tasks if the Git source is not null.

Resolves: #402

## Tests

There is a unit test and also tested with a sample bundle:

```
resources:
  jobs:
    demo:
      git_source:
        git_branch: master
        git_provider: github
        git_url: https://github.com/test/dummy
   ....
```

---------

Co-authored-by: Pieter Noordhuis <pieter.noordhuis@databricks.com>
2023-06-07 12:34:59 +02:00
Pieter Noordhuis 349e2aff40
Allow equivalence checking of filer errors to fs errors (#416)
## Changes

The pattern `errors.Is(err, fs.ErrNotExist)` is common to check for an
error type.

Errors can implement `Is(error) bool` with a custom equivalence checker.

## Tests

New asserts all pass in the integration test.
2023-05-31 20:47:00 +02:00
Pieter Noordhuis e4ab455ea1
Don't pass synthesized TMPDIR if not already set (#409)
## Changes

On Unix systems, the default of `/tmp` always works. No need to
synthesize a path for it.

The custom TMPDIR was causing issues when used from GitHub Actions
runners.

## Tests

Confirmed manually this fixes the issue on GitHub Actions runners.
2023-05-26 13:05:30 +02:00
Andrew Nester 6141476ca2
Added support for bundle.Seq, simplified Mutator.Apply interface (#403)
## Changes
Added support for `bundle.Seq` and simplified the `Mutator.Apply`
interface by removing the list of mutators from its return values.

## Tests
1. Ran `cli bundle deploy` and interrupted it with Cmd + C mid-execution
so the lock is not released
2. Ran `cli bundle deploy` to make sure that the CLI does not try to
release the lock when it fails to acquire it
```
andrew.nester@HFW9Y94129 multiples-tasks % cli bundle deploy
Starting upload of bundle files
Uploaded bundle files at /Users/andrew.nester@databricks.com/.bundle/simple-task/development/files!

^C
andrew.nester@HFW9Y94129 multiples-tasks % cli bundle deploy
Error: deploy lock acquired by andrew.nester@databricks.com at 2023-05-24 12:10:23.050343 +0200 CEST. Use --force to override
```
2023-05-24 14:45:19 +02:00
Andrew Nester 273271bc59
Regenerated internal schema structs based on Terraform provider schemas (#401)
## Changes
Regenerated internal schema structs based on Terraform provider schemas

This allows use of the `serverless` flag in bundle config.

## Tests
Ran `cli bundle deploy` with a bundle that contains a pipeline with the
serverless key set to true.
2023-05-23 19:33:24 +02:00
shreyas-goenka c53ad860e6
Create tmp files in the cache dir in terraform command runs (#395)
## Changes
Passes through tmp-dir-related env vars to the terraform process. In case
any of them are not set, we assign a temp dir inside the bundle cache dir
as the location Terraform should use.

## Tests
Manually checked that these env vars do override the location where
os.CreateTemp files are created.
2023-05-23 13:51:15 +02:00
Fabian Jakobs 055e528173
Rename: bricks -> databricks (#393)
## Changes
related to https://github.com/databricks/databricks-vscode/pull/721

## Rename env vars

`BRICKS_CLI_PATH` -> `DATABRICKS_CLI_PATH`
`BRICKS_OUTPUT_FORMAT` -> `DATABRICKS_OUTPUT_FORMAT`
`BRICKS_LOG_FILE` -> `DATABRICKS_LOG_FILE`
`BRICKS_LOG_LEVEL` -> `DATABRICKS_LOG_LEVEL`
`BRICKS_LOG_FORMAT` -> `DATABRICKS_LOG_FORMAT`
`BRICKS_PROGRESS_FORMAT` -> `DATABRICKS_CLI_PROGRESS_FORMAT`
`BRICKS_UPSTREAM` -> `DATABRICKS_CLI_UPSTREAM`
`BRICKS_UPSTREAM_VERSION` -> `DATABRICKS_CLI_UPSTREAM_VERSION`
2023-05-22 16:40:50 +02:00
Pieter Noordhuis 8979ed1394
Fix tests for new repository name (#390) 2023-05-16 19:02:07 +02:00
Pieter Noordhuis 98ebb78c9b
Rename bricks -> databricks (#389)
## Changes

Rename all instances of "bricks" to "databricks".

## Tests

* Confirmed the goreleaser build works, uses the correct new binary
name, and produces the right archives.
* Help output is confirmed to be correct.
* Output of `git grep -w bricks` is minimal with a couple changes
remaining for after the repository rename.
2023-05-16 18:35:39 +02:00
Andrew Nester 180dfc9a40
Added ability for deferred mutator execution (#380)
## Changes
Added `DeferredMutator` and the `bundle.Defer` function, which make it
possible to always execute some mutators either at the end of the
execution chain or after an error occurs in the middle of it.

Usage as follows:

```
deferredMutator := bundle.Defer([]bundle.Mutator{
    lock.Acquire(),
    transform.DoSomething(),
    //...
}, []bundle.Mutator{
    lock.Release(),
})
```
In such a case, `lock.Release()` will always be executed: either when all
operations above succeed or when any of them fails.

## Tests
Before the change

```
andrew.nester@HFW9Y94129 multiples-tasks % bricks bundle deploy
Starting upload of bundle files
Uploaded bundle files at /Users/andrew.nester@databricks.com/.bundle/simple-task/development/files!

Error: terraform not initialized
andrew.nester@HFW9Y94129 multiples-tasks % bricks bundle deploy
Error: deploy lock acquired by andrew.nester@databricks.com at 2023-05-10 16:41:22.902659 +0200 CEST. Use --force to override

```

After the change
```
andrew.nester@HFW9Y94129 multiples-tasks % bricks bundle deploy 
Starting upload of bundle files
Uploaded bundle files at /Users/andrew.nester@databricks.com/.bundle/simple-task/development/files!

Error: terraform not initialized
andrew.nester@HFW9Y94129 multiples-tasks % bricks bundle deploy
Starting upload of bundle files
Uploaded bundle files at /Users/andrew.nester@databricks.com/.bundle/simple-task/development/files!

Error: terraform not initialized
```
2023-05-16 18:01:50 +02:00
Andrew Nester 33fb0b3c40
Do not truncate local state file when pulling remote changes (#382)
## Changes
When a local state file exists, it won't be overridden by the remote state file.

## Tests
Running `bricks bundle deploy` after a failed state push does not override
the local state file.

Use cases verified:
1. Local state file is newer than remote
2. Local state file is older than remote
3. Local state file does not exist
4. Local state file corrupted
2023-05-16 17:02:33 +02:00
shreyas-goenka dd04875ee9
Add config environment support for variable overriding (#383)
## Changes
Allows overriding the default value of a variable definition from the
environment block in a bundle config. See bundle.yml for example usage,
and the sketch below.
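
A minimal sketch of such an override, assuming the environment block accepts a `variables` mapping (all names are hypothetical):

```yaml
variables:
  cluster_id:
    default: dev-cluster-id

environments:
  production:
    # Overrides the default value above when deploying to production.
    variables:
      cluster_id: prod-cluster-id
```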

## Tests
Unit tests

---------

Co-authored-by: Pieter Noordhuis <pieter.noordhuis@databricks.com>
2023-05-15 14:07:18 +02:00
shreyas-goenka c5e940f664
Add support for variables in bundle config (#359)
## Changes
This PR now allows you to define variables in the bundle config and set
them in three ways
1. command line args
2. process environment variable
3. in the bundle config itself
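
A hedged sketch of a variable definition; the exact flag and environment variable spellings in the comments (`--var`, `BUNDLE_VAR_<name>`) are assumptions, not confirmed by this commit message:

```yaml
variables:
  cluster_id:
    description: Cluster to run the job on
    # (3) Default set in the config itself; it can be overridden by
    # (1) a command line arg (e.g. --var="cluster_id=...") or
    # (2) a process environment variable (e.g. BUNDLE_VAR_cluster_id).
    default: 1234-567890-abcde123
```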

## Tests
manually, unit, and black box tests

---------

Co-authored-by: Miles Yucht <miles@databricks.com>
2023-05-15 11:34:05 +02:00
Andrew Nester 473d2bf503
Improved error message when 'bricks bundle run' is executed before 'bricks bundle deploy' (#378)
## Changes
Improved error message when 'bricks bundle run' is executed before
'bricks bundle deploy'

The error happens when we attempt to load terraform state when it does
not exist.

The best way to check if terraform state actually exists is to call
`terraform show -json` and that's what already happens here

https://github.com/databricks/bricks/compare/main...error-before-deploy#diff-8c50f8c04e568397bc865b7e02d1f4ec5b18379d8d32daddfeb041035d804f5fL28

Absence of `state.Values` indicates that there is no state and that the
bundle was likely never deployed.

## Tests
Ran `bricks bundle run test_job` on a new non-deployed bundle.

**Output:**

`Error: terraform show: No state. Did you forget to run 'bricks bundle
deploy'?`

Running `bricks bundle deploy && bricks bundle run test_job` succeeds.

---------

Co-authored-by: Pieter Noordhuis <pieter.noordhuis@databricks.com>
2023-05-10 11:02:25 +02:00
Andrew Nester 1916bc9d68
Fixed printing the tasks in job output in DAG execution order (#377)
Fixes #259

## Changes
Sort task output in an execution order based on task end time

## Tests
Added `TestTaskJobOutputOrderToString` unit test.
2023-05-08 16:35:47 +02:00
shreyas-goenka 37af3d5c4f
Add omitempty tag to bundle git details (#372)
## Changes
Add the omitempty tag to git details. Otherwise this field becomes a
required field in the config JSON schema.

## Tests
Tested by regenerating the json schema and checking that the git field
is now optional
2023-05-01 14:34:12 +02:00
shreyas-goenka 9e16140b6e
Add git config block to bundle config (#356)
## Changes
This config block contains commit, branch and remote_url which will be
automatically loaded if specified in the repo, and can also be specified
by the user

## Tests
Unit and black-box tests
2023-04-26 16:54:36 +02:00
Serge Smertin 9581187c9e
Update to Go SDK v0.8.0 (#351)
## Changes

- Update to Go SDK v0.8.0
- Fix all breaking changes

## Tests

- make test
2023-04-21 10:30:20 +02:00
shreyas-goenka 9b06095e47
Add support for multiple level string variable interpolation (#342)
## Changes
Traverses the referenced variables in a depth-first manner to resolve
string fields.
Errors out if a cycle is detected.
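
A small sketch of multi-level resolution and the cycle case, assuming `${var.*}` references (names are hypothetical):

```yaml
variables:
  region:
    default: us-west-2
  bucket:
    # Resolved depth-first: `region` is resolved before `bucket`.
    default: "s3://my-bucket-${var.region}"
  # A cycle such as a -> b -> a errors out instead of looping:
  # a:
  #   default: ${var.b}
  # b:
  #   default: ${var.a}
```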

## Tests
Manually and unit/blackbox tests
2023-04-20 01:13:33 +02:00
shreyas-goenka 089bebc92f
Do not print exceptions for non ERROR events (#347)
## Changes
Adds a check to not print exception traces for DLT events with a level
below ERROR.

## Tests
Unit test
2023-04-19 22:11:05 +02:00
shreyas-goenka 598ad62688
Log mutator messages using progress logger (#312)
This PR uses progress logger to log messages inside mutators
2023-04-18 16:55:06 +02:00
shreyas-goenka d0872b45e2
Log pipeline update errors using progress logger (#338)
## Changes
Logs error message for all exceptions

## Tests
Manually and using unit tests
2023-04-18 15:00:34 +02:00
shreyas-goenka 59eee11989
Log job errors using progress logger (#337)
## Changes
This PR logs job errors using the progress logger

## Tests
Manually
2023-04-18 14:58:20 +02:00
shreyas-goenka 1a7b3eef18
Log job run url using progress logger (#336)
## Changes
Logs the job url using the progress logger

## Tests
Manually
2023-04-18 14:40:45 +02:00
shreyas-goenka 85889dffb1
Move state to event for whether they support inplace progress logging (#339)
## Changes
Adds an IsInplaceSupported() function to the event interface. Any event
that uses the progress logger now has to declare whether it supports
in-place logging.

## Tests
Manually
2023-04-18 14:20:35 +02:00
shreyas-goenka 93d57dd00f
Detect duplicate identifiers in bundle config (#332)
## Changes
This PR adds checks during bundle config load and merge to error out if
there are duplicate keys for resource definitions
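
For example (file and job names hypothetical), defining the same resource key in two config files that get merged now raises an error instead of silently overwriting one definition with the other:

```yaml
# bundle.yml
resources:
  jobs:
    my_job:
      name: "first definition"

# included.yml, merged into the same bundle; the duplicate `my_job`
# key below is now detected and reported as an error:
# resources:
#   jobs:
#     my_job:
#       name: "second definition"
```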

## Tests
Using unit tests and manually
2023-04-17 12:21:21 +02:00
Shreyas Goenka eab29603fc
Revert "Log job errors using progress logger"
This reverts commit a2e20f5206.
2023-04-15 15:19:32 +02:00
Shreyas Goenka a2e20f5206
Log job errors using progress logger 2023-04-15 15:18:38 +02:00
shreyas-goenka e8018a7209
Refactor output and progress into separate packages in run (#335)
Tested manually that output and progress logging still works
2023-04-14 14:40:34 +02:00
shreyas-goenka df0293510e
Fixes for pipeline progress logging (#330)
## Changes
1. Events are now printed in chronological order
2. Simplify event rendering by removing the update/flow name. This makes it
more consistent with the web UI too
3. Switch to server side filtering on update_id

## Tests
Manually

Happy run:
```
shreyas.goenka@THW32HFW6T pipeline-progress % bricks bundle run foo
2023-04-12T20:00:22.879Z update_progress INFO "Update e1becc is INITIALIZING."
2023-04-12T20:00:22.906Z update_progress INFO "Update e1becc is SETTING_UP_TABLES."
2023-04-12T20:00:24.496Z update_progress INFO "Update e1becc is RUNNING."
2023-04-12T20:00:24.497Z flow_progress   INFO "Flow 'sales_orders_raw' is QUEUED."
2023-04-12T20:00:24.586Z flow_progress   INFO "Flow 'sales_orders_raw' is STARTING."
2023-04-12T20:00:24.748Z flow_progress   INFO "Flow 'sales_orders_raw' is RUNNING."
2023-04-12T20:00:26.672Z flow_progress   INFO "Flow 'sales_orders_raw' has COMPLETED."
2023-04-12T20:00:27.753Z update_progress INFO "Update e1becc is COMPLETED."
```

Sad run:
```
shreyas.goenka@THW32HFW6T pipeline-progress % bricks bundle run foo
2023-04-12T20:02:07.764Z update_progress INFO "Update 04b80e is INITIALIZING."
2023-04-12T20:02:07.870Z update_progress ERROR "Update 04b80e is FAILED."
Error: update failed
```
2023-04-14 12:21:44 +02:00
shreyas-goenka 3894d5796d
Add progress logging event for pipeline update URLs (#331)
## Changes
Output now: 
```
shreyas.goenka@THW32HFW6T pipeline-progress % bricks bundle run foo
The update can be found at https://e2-dogfood.staging.cloud.databricks.com/#joblist/pipelines/1cc605db-daab-4218-b38a-a63030e3eb03/updates/f92f2159-1141-47de-b1e2-1ca854b7238f

2023-04-12T20:41:19.813Z update_progress INFO "Update f92f21 is INITIALIZING."
2023-04-12T20:41:19.841Z update_progress INFO "Update f92f21 is SETTING_UP_TABLES."
2023-04-12T20:41:21.270Z update_progress INFO "Update f92f21 is RUNNING."
2023-04-12T20:41:21.271Z flow_progress   INFO "Flow 'sales_orders_raw' is QUEUED."
2023-04-12T20:41:21.349Z flow_progress   INFO "Flow 'sales_orders_raw' is STARTING."
2023-04-12T20:41:21.480Z flow_progress   INFO "Flow 'sales_orders_raw' is RUNNING."
2023-04-12T20:41:23.493Z flow_progress   INFO "Flow 'sales_orders_raw' has COMPLETED."
2023-04-12T20:41:25.484Z update_progress INFO "Update f92f21 is COMPLETED."
```

2023-04-14 11:11:30 +02:00
shreyas-goenka 417839021b
Add top level docs for bundle json schema (#313)
Co-authored-by: Pieter Noordhuis <pieter.noordhuis@databricks.com>
Co-authored-by: PaulCornellDB <paul.cornell@databricks.com>
2023-04-12 21:43:53 +02:00
Pieter Noordhuis b388f4a0dc
Make all workspace paths string fields (#327)
## Changes

These are unlikely to ever be DBFS paths so we can remove this level of indirection to simplify.

**Note:** this is a breaking change. Downstream usage of these fields must be updated.

## Tests

Existing tests pass.
2023-04-12 16:54:36 +02:00
Pieter Noordhuis 31ccebd62a
Store relative path to configuration file for every resource (#322)
## Changes

If a configuration file is located in a subdirectory of the bundle root,
files referenced from that configuration file should be relative to its
configuration file's directory instead of the bundle root.
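
A sketch of the new behavior with a hypothetical layout: a path in a config file under `resources/` now resolves relative to that file rather than the bundle root:

```yaml
# resources/my_job.yml
resources:
  jobs:
    my_job:
      tasks:
        - task_key: main
          notebook_task:
            # Resolves to resources/notebook.py (relative to this file),
            # not to notebook.py at the bundle root.
            notebook_path: ./notebook.py
```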

## Tests

* New tests in `bundle/config/mutator/translate_paths_test.go`.
* Existing tests under `bundle/tests` pass and are augmented to assert
on paths.

---------

Co-authored-by: shreyas-goenka <88374338+shreyas-goenka@users.noreply.github.com>
2023-04-12 16:17:13 +02:00
Miles Yucht 946906221d
Delete sync snapshots file when destroying a bundle (#323)
## Changes
This PR changes the files.Delete() mutator to delete the sync snapshots
file on destroy. This ensures that files will be uploaded when the
bundle is uploaded again.

## Tests
- [x] Manual test: Ran `bricks bundle destroy`, observed that the sync
snapshots file was deleted.
2023-04-11 16:57:01 +02:00
Pieter Noordhuis 42d29f92c9
Pass through $HOME when invoking Terraform (#319)
## Changes

This is useful when developing the Databricks Terraform provider where
you keep a local-only build of the provider and refer to it using $HOME
from `~/.terraformrc`, for example like this:

```
plugin_cache_dir = "$HOME/.terraform.d/plugin-cache"
```

## Tests

That $HOME is passed through cannot be tested as is because the
`tfexec.Terraform` struct doesn't expose it through public fields or
methods. What can be tested is a successful run of the initialize
mutator and this is included in this commit.
2023-04-11 13:11:31 +02:00
shreyas-goenka 4871f7bc8a
Add bundle destroy command (#300)
Adds bundle destroy capability to bricks
2023-04-06 12:54:58 +02:00
shreyas-goenka 6feaed4990
Fix host based auth conflicting with DEFAULT profile (#309)
## Changes
Consider the following host based configuration:
```
bundle:
  name: job_with_file_task

workspace:
  host: https://e2-dogfood.staging.cloud.databricks.com/
```

If you have a DEFAULT profile, then this host is ignored. The solution
proposed here is to remove the profile config loader if host is
explicitly specified in the bundle config.

This does come with a cost, namely that a `DATABRICKS_CONFIG_PROFILE`
env var will be ignored, which maybe goes against the unified auth spec.

The ideal solution here is probably to make a change to the Go SDK to not
select the DEFAULT profile if the host is not empty.

2023-04-05 18:12:11 +02:00
Pieter Noordhuis d7ac265536
Allow use of file library in pipeline (#308)
## Changes

This requires databricks/databricks-sdk-go#359.

## Tests

Tests pass and ran manual verification of deployment with files.
2023-04-05 16:29:42 +02:00
Pieter Noordhuis 4e4c0658db
Interpolate paths for job tasks that reference files (#306)
## Changes

This change also swaps the order of mutators such that interpolation
happens before path translation. This means that it is possible to use
variables (e.g. `${bundle.environment}`) in notebook or file paths.
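
For example (job and layout hypothetical), a file path can now contain a variable because interpolation runs before path translation:

```yaml
resources:
  jobs:
    my_job:
      tasks:
        - task_key: main
          spark_python_task:
            # `${bundle.environment}` is interpolated first; the resulting
            # relative path is then translated.
            python_file: ./src/${bundle.environment}/main.py
```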

## Tests

New tests pass and verified manually.
2023-04-05 16:02:17 +02:00
shreyas-goenka 7427ceba6c
Fix output panic (#311)
## Changes

Output now:
```
{
  "run_page_url": "https://e2-dogfood.staging.cloud.databricks.com/?o=6051921418418893#job/6199333392110/run/1088443776202122",
  "task_outputs": {
    "input": null,
    "process": {
      "logs": "[Row(max(id)=9)]\n",
      "logs_truncated": false
    }
  }
}
```

2023-04-05 15:55:24 +02:00
shreyas-goenka 8de7d32ed1
Add readonly bundle tag for internal fields (#302)
This PR adds a `bundle:"readonly"` struct tag to the JSON schema
generator. This allows us to skip generating JSON schema for internal
read-only fields.

Tested using a unit test.
2023-04-04 12:16:07 +02:00
shreyas-goenka ddbb17b0d9
Regenerate generated empty json schema docs (#301)
2023-04-04 12:07:30 +02:00
dependabot[bot] 57cf66d3a8
Bump github.com/databricks/databricks-sdk-go from 0.5.0 to 0.6.0 (#299) 2023-04-03 21:33:21 +02:00
Pieter Noordhuis f26806be8f
Set BRICKS_CLI_PATH only if it cannot be derived from $PATH (#298)
## Changes

Related to #237.

Output of `bricks auth env` now doesn't include `BRICKS_CLI_PATH` if it
can be found in $PATH.

## Tests

Verified manually.
2023-04-03 16:23:53 +02:00
shreyas-goenka b4a30c641c
Add progress logging for pipeline runs (#283)
Add progress logging for pipeline runs
2023-03-31 17:04:12 +02:00
Pieter Noordhuis 04e77102c9
Add mutators to pull and push Terraform state (#288)
## Changes

Pull state before deploying and push state after deploying.

Note: the run command was missing mutators to initialize Terraform. This
is necessary if the cache directory is removed between running "deploy"
and "run" (which is valid now that we synchronize state).

## Tests

Manually.
2023-03-30 12:01:09 +02:00
Pieter Noordhuis 0ea0e81c8a
Ignore databricks_permissions resource when loading Terraform state (#291)
## Changes

The databricks_permissions resource may be generated if a bundle
resource includes a `permissions` block. There's no need to incorporate
details from the materialization into the bundle configuration struct.

## Tests

Confirmed that this fixes `bricks bundle run` when dealing with a bundle
with permission configuration.
2023-03-29 21:14:52 +02:00
Pieter Noordhuis 87207bba78
Configure Terraform provider auth through env vars (#290)
## Changes

Auth relied on setting a profile. In this change we enumerate all
configuration properties and export all non-empty ones as a map with
environment variables. We then pass this map to the Terraform execution
wrapper.

This results in Terraform using the bundle's authentication
configuration.

This change is needed to make #287 work.

## Tests

Manually.
2023-03-29 20:46:09 +02:00
Pieter Noordhuis cfd32c9602
Try to resolve a profile if only the host is specified (#287)
## Changes

This improves out of the box usability where a user who already
configured a `.databrickscfg` file will be able to reference the
workspace host in their `bundle.yml` and it will automatically pick up
the right profile.

## Tests

* Newly added tests pass.
* Manual testing confirms intended behavior.

---------

Co-authored-by: shreyas-goenka <88374338+shreyas-goenka@users.noreply.github.com>
2023-03-29 20:44:19 +02:00
Pieter Noordhuis 8af934bbbb
Function to find the Git repository containing a bundle (#289)
## Changes

Useful functions from #277.

## Tests

Tests pass.
2023-03-29 16:36:35 +02:00
shreyas-goenka 8fd3dccca9
Add progress logs for job runs (#276) 2023-03-29 14:58:09 +02:00
Pieter Noordhuis edd8630f71
Log mutator phase at info level (#272) 2023-03-22 17:02:22 +01:00
Pieter Noordhuis 123a5e15e9
Acquire lock prior to deploy (#270)
Add configuration:

```
bundle:
  lock:
    enabled: true
    force: false
```

The force field can be set by passing the `--force` argument to `bricks
bundle deploy`. Doing so means the deployment lock is acquired even if
it is currently held. This should only be used in exceptional cases
(e.g. a previous deployment has failed to release the lock).
2023-03-22 16:37:26 +01:00
Pieter Noordhuis 6850caf2a2
Include mutator name in logging context (#271) 2023-03-22 15:54:10 +01:00
shreyas-goenka bfa20cdec9
Add json tags to output fields (#269)
output now:
```
{
  "run_page_url": "https://adb-309687753508875.15.azuredatabricks.net/?o=309687753508875#job/1077573342009637/run/19099317",
  "task_outputs": {
    "my_notebook_task": {
      "result": "computed results from notebook."
    }
  }
}
```
2023-03-21 18:38:11 +01:00
shreyas-goenka 75d516939b
Error out if notebook file does not exist locally (#261)
Adds a check for whether the file exists locally.

case 1: local (relative) file does not exist
```
    foo:
      name: "[job-output] test-job by shreyas"

      tasks:
        - task_key: my_notebook_task
          existing_cluster_id: ***
          notebook_task:
            notebook_path: "./doesnotexist"
```
output:
```
shreyas.goenka@THW32HFW6T job-output % bricks bundle deploy
Error: notebook ./doesnotexist not found. Error: open /Users/shreyas.goenka/projects/job-output/doesnotexist: no such file or directory
```


case 2: remote (absolute) file does not exist
```
    foo:
      name: "[job-output] test-job by shreyas"

      tasks:
        - task_key: my_notebook_task
          existing_cluster_id: ***
          notebook_task:
            notebook_path: "/Users/shreyas.goenka@databricks.com/doesnotexist"
```

output:
```
shreyas.goenka@THW32HFW6T job-output % bricks bundle deploy
shreyas.goenka@THW32HFW6T job-output % bricks bundle run foo
Error: failed to reach TERMINATED or SKIPPED, got INTERNAL_ERROR: Task my_notebook_task failed with message: Notebook not found: /Users/shreyas.goenka@databricks.com/doesnotexist. This caused all downstream tasks to get skipped.
```

case 3: remote exists
Successful deploy and run
2023-03-21 18:13:16 +01:00
shreyas-goenka 047a189c1e
Add job run output logging (#260)
This PR adds output logging for job runs

Tested using unit tests and manually
2023-03-21 16:25:18 +01:00
shreyas-goenka 4ac2e33def
Throw error when job run is skipped due to max_concurrent_runs (#257)
Tested manually:

Before, we did not get any errors/logs and silently failed in this
case.

```
shreyas.goenka@THW32HFW6T job-output % bricks bundle run foo
Error: run skipped: Skipping this run because the limit of 1 maximum concurrent runs has been reached.
```
2023-03-21 13:17:15 +01:00
Pieter Noordhuis 66ca9ec266
Add permissions block to each resource (#264)
Example:

```yaml
resources:
  jobs:
    my_job:
      name: "[${bundle.environment}] My job"
      permissions:
        - level: CAN_VIEW
          group_name: users
```
2023-03-21 10:58:16 +01:00