Commit Graph

51 Commits

Author SHA1 Message Date
Arpit Jasapara 24cc67563e
Support Unity Catalog Registered Models in bundles (#846)
## Changes
Add UC Registered Models support to Databricks Asset Bundles as a new
resource, `registered_model`. Also added UC permission support via a new
resource, `grant`.
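
A hypothetical sketch of what such a configuration might look like (resource
and field names are illustrative, not the exact schema):

```yaml
# Illustrative only: a UC registered model with a grant.
resources:
  registered_models:
    my_model:
      catalog_name: main
      schema_name: default
      name: my_model
      grants:
        - principal: "account users"
          privileges:
            - EXECUTE
```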

## Tests
Tested via unit tests and manual testing with [example
PR](https://github.com/databricks/bundle-examples-internal/pull/80) and
[custom Terraform
provider](https://github.com/databricks/terraform-provider-databricks/pull/2771).

---------

Signed-off-by: Arpit Jasapara <arpit.jasapara@databricks.com>
Co-authored-by: Andrew Nester <andrew.nester.dev@gmail.com>
Co-authored-by: Pieter Noordhuis <pieter.noordhuis@databricks.com>
2023-10-16 15:32:49 +00:00
shreyas-goenka fe32c46dc8
Make bundle deploy work if no resources are defined (#767)
## Changes

This PR sets "resource" to nil in the terraform representation if no
resources are defined in the bundle configuration. This solves two
problems:

1. Makes bundle deploy work without any resources specified (see the sketch
below).
2. Previously, if a `resources` block was removed after a deployment, the
next deployment would fail with an error. Now the resources are destroyed as
expected.

Also removes `TerraformHasNoResources` which is no longer needed.
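
For reference, a minimal sketch of a bundle configuration with no resources
at all, which should now deploy cleanly (the workspace host is a placeholder):

```yaml
# Minimal bundle with no resources block defined.
bundle:
  name: no-resources-example

workspace:
  host: https://<workspace-url>
```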

## Tests
New e2e tests.
2023-09-13 22:50:37 +00:00
Pieter Noordhuis 4ccc70aeac
Consolidate environment variable interaction (#747)
## Changes

There are a couple of places throughout the code base where interaction
with environment variables takes place. Moreover, more than one of these
would try to read a value from more than one environment variable as a
fallback (for backwards compatibility). This change consolidates those
accesses.

The majority of diffs in this change are mechanical (i.e. add an
argument or replace a call).

This change:
* Moves common environment variable lookups for bundles to
`bundles/env`.
* Adds a `libs/env` package that wraps `os.LookupEnv` and `os.Getenv`
and allows for overrides to take place in a `context.Context`. By
scoping overrides to a `context.Context` we can avoid `t.Setenv` in
testing and unlock parallel test execution for integration tests.
* Updates call sites to pass through a `context.Context` where needed.
* For bundles, introduces `DATABRICKS_BUNDLE_ROOT` as new primary
variable instead of `BUNDLE_ROOT`. This was the last environment
variable that did not use the `DATABRICKS_` prefix.

## Tests

Unit tests pass.
2023-09-11 08:18:43 +00:00
Andrew Nester f7566b8264
Close local Terraform state file when pushing to remote (#752)
## Changes
Close local Terraform state file when pushing to remote

Should help fix E2E test cleanup errors like:
```
testing.go:1225: TempDir RemoveAll cleanup: remove 
C:\Users\RUNNER~1\AppData\Local\Temp\TestAccPythonWheelTaskDeployAndRun1395546390\001\.databricks\bundle\default\terraform\terraform.tfstate: 
The process cannot access the file because it is being used by another process.
```
2023-09-08 10:47:17 +00:00
Arpit Jasapara 50eaf16307
Support Model Serving Endpoints in bundles (#682)
## Changes
Add Model Serving Endpoints to Databricks Bundles
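
A hypothetical sketch of the new resource (field names are illustrative, not
the exact schema):

```yaml
# Illustrative only: a model serving endpoint serving one model version.
resources:
  model_serving_endpoints:
    my_endpoint:
      name: my-endpoint
      config:
        served_models:
          - model_name: my_model
            model_version: "1"
            workload_size: Small
            scale_to_zero_enabled: true
```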

## Tests
Unit tests and manual testing via
https://github.com/databricks/bundle-examples-internal/pull/76

Signed-off-by: Arpit Jasapara <arpit.jasapara@databricks.com>
2023-09-07 21:54:31 +00:00
Andrew Nester 10e0836749
Added end-to-end test for deploying and running Python wheel task (#741)
## Changes
Added end-to-end test for deploying and running Python wheel task

## Tests
Test passed successfully on all environments; it takes about 9-10 minutes to
run.

```
Deleted snapshot file at /var/folders/nt/xjv68qzs45319w4k36dhpylc0000gp/T/TestAccPythonWheelTaskDeployAndRun1845899209/002/.databricks/bundle/default/sync-snapshots/1f7cc766ffe038d6.json
Successfully deleted files!
2023/09/06 17:50:50 INFO Releasing deployment lock mutator=destroy mutator=seq mutator=seq mutator=deferred mutator=lock:release
--- PASS: TestAccPythonWheelTaskDeployAndRun (508.16s)
PASS
coverage: 77.9% of statements in ./...
ok      github.com/databricks/cli/internal/bundle       508.810s        coverage: 77.9% of statements in ./...
```

---------

Co-authored-by: Pieter Noordhuis <pieter.noordhuis@databricks.com>
2023-09-07 14:08:16 +00:00
Pieter Noordhuis c0ebfb8101
Fix conversion of job parameters (#744)
## Changes

Another example of singular/plural conversion.

The longer-term solution is to do a full sweep of the type using reflection
to make sure we cover all fields.

## Tests

Unit test passes.
2023-09-07 12:48:59 +00:00
Lennart Kats (databricks) f9e521b43e
databricks bundle init template v2: optional stubs, DLT support (#700)
## Changes

This follows up on https://github.com/databricks/cli/pull/686. This PR
makes our stubs optional and adds DLT stubs:

```
$ databricks bundle init
Template to use [default-python]: default-python
Unique name for this project [my_project]: my_project
Include a stub (sample) notebook in 'my_project/src' [yes]: yes
Include a stub (sample) DLT pipeline in 'my_project/src' [yes]: yes
Include a stub (sample) Python package 'my_project/src' [yes]: yes
 Successfully initialized template
```

## Tests
Manual testing, matrix tests.

---------

Co-authored-by: Andrew Nester <andrew.nester@databricks.com>
Co-authored-by: PaulCornellDB <paul.cornell@databricks.com>
Co-authored-by: Pieter Noordhuis <pieter.noordhuis@databricks.com>
2023-09-06 09:52:31 +00:00
Pieter Noordhuis fabe8e88b8
Include $PATH in set of environment variables to pass along. (#736)
## Changes

This is necessary to ensure that our Terraform provider can use the same
auxiliary programs (e.g. `az`, or `gcloud`) as the CLI.

## Tests

Unit test and manual verification.
2023-09-06 07:54:35 +00:00
Pieter Noordhuis 46b999ed42
Pin Terraform binary version to 1.5.5 (#715)
## Changes

The installer doesn't respect the version constraints if they are
specified.

Source: [the vc argument is not
used](850464c601/releases/latest_version.go (L158-L177)).

## Tests

Confirmed manually.
2023-08-30 14:08:37 +00:00
Andrew Nester e3e9bc6def
Added support for sync.include and sync.exclude sections (#671)
## Changes
Added support for `sync.include` and `sync.exclude` sections
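
For example, both sections can be combined to narrow down what gets synced
(paths are illustrative):

```yaml
sync:
  include:
    - "./src/*.py"
  exclude:
    - "./src/scratch/*"
```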

## Tests
Added `sample-java` folder to gitignore

```
bundle:
  name: wheel-task

sync:
  include:
    - "./sample-java/*.kts"
```

Kotlin files were correctly synced.

```
[DEBUG] Test execution command:  /opt/homebrew/opt/go@1.21/bin/go test ./... -json -timeout 1h -coverpkg=./... -coverprofile=coverage.txt -run ^TestAcc
[DEBUG] Test execution directory:  /Users/andrew.nester/cli
2023/08/17 17:12:10 [INFO]  TestAccAlertsCreateErrWhenNoArguments (2.320s)
2023/08/17 17:12:10 [INFO]  TestAccApiGet (0.650s)
2023/08/17 17:12:12 [INFO]  TestAccClustersList (1.060s)
2023/08/17 17:12:12 [INFO]  TestAccClustersGet (0.760s)
2023/08/17 17:12:26 [INFO]  TestAccFilerWorkspaceFilesReadWrite (13.270s)
2023/08/17 17:12:32 [INFO]  TestAccFilerWorkspaceFilesReadDir (6.860s)
2023/08/17 17:12:46 [INFO]  TestAccFilerDbfsReadWrite (13.380s)
2023/08/17 17:12:53 [INFO]  TestAccFilerDbfsReadDir (7.460s)
2023/08/17 17:13:01 [INFO]  TestAccFilerWorkspaceNotebookConflict (7.920s)
2023/08/17 17:13:10 [INFO]  TestAccFilerWorkspaceNotebookWithOverwriteFlag (9.290s)
2023/08/17 17:13:10 [INFO]  TestAccFilerLocalReadWrite (0.010s)
2023/08/17 17:13:11 [INFO]  TestAccFilerLocalReadDir (0.010s)
2023/08/17 17:13:14 [INFO]  TestAccFsCatForDbfs (3.180s)
2023/08/17 17:13:15 [INFO]  TestAccFsCatForDbfsOnNonExistentFile (0.940s)
2023/08/17 17:13:15 [INFO]  TestAccFsCatForDbfsInvalidScheme (0.560s)
2023/08/17 17:13:18 [INFO]  TestAccFsCatDoesNotSupportOutputModeJson (2.910s)
2023/08/17 17:13:51 [INFO]  TestAccFsCpDir (32.730s)
2023/08/17 17:14:06 [INFO]  TestAccFsCpFileToFile (14.740s)
2023/08/17 17:14:20 [INFO]  TestAccFsCpFileToDir (14.340s)
2023/08/17 17:14:53 [INFO]  TestAccFsCpDirToDirFileNotOverwritten (32.710s)
2023/08/17 17:15:12 [INFO]  TestAccFsCpFileToDirFileNotOverwritten (19.590s)
2023/08/17 17:15:32 [INFO]  TestAccFsCpFileToFileFileNotOverwritten (19.950s)
2023/08/17 17:16:11 [INFO]  TestAccFsCpDirToDirWithOverwriteFlag (38.970s)
2023/08/17 17:16:32 [INFO]  TestAccFsCpFileToFileWithOverwriteFlag (21.040s)
2023/08/17 17:16:52 [INFO]  TestAccFsCpFileToDirWithOverwriteFlag (19.670s)
2023/08/17 17:16:54 [INFO]  TestAccFsCpErrorsWhenSourceIsDirWithoutRecursiveFlag (1.890s)
2023/08/17 17:16:54 [INFO]  TestAccFsCpErrorsOnInvalidScheme (0.690s)
2023/08/17 17:17:10 [INFO]  TestAccFsCpSourceIsDirectoryButTargetIsFile (15.810s)
2023/08/17 17:17:14 [INFO]  TestAccFsLsForDbfs (4.000s)
2023/08/17 17:17:18 [INFO]  TestAccFsLsForDbfsWithAbsolutePaths (4.000s)
2023/08/17 17:17:21 [INFO]  TestAccFsLsForDbfsOnFile (3.140s)
2023/08/17 17:17:23 [INFO]  TestAccFsLsForDbfsOnEmptyDir (2.030s)
2023/08/17 17:17:24 [INFO]  TestAccFsLsForDbfsForNonexistingDir (0.840s)
2023/08/17 17:17:25 [INFO]  TestAccFsLsWithoutScheme (0.590s)
2023/08/17 17:17:27 [INFO]  TestAccFsMkdirCreatesDirectory (2.310s)
2023/08/17 17:17:30 [INFO]  TestAccFsMkdirCreatesMultipleDirectories (2.800s)
2023/08/17 17:17:33 [INFO]  TestAccFsMkdirWhenDirectoryAlreadyExists (2.700s)
2023/08/17 17:17:35 [INFO]  TestAccFsMkdirWhenFileExistsAtPath (2.870s)
2023/08/17 17:17:40 [INFO]  TestAccFsRmForFile (4.030s)
2023/08/17 17:17:43 [INFO]  TestAccFsRmForEmptyDirectory (3.470s)
2023/08/17 17:17:46 [INFO]  TestAccFsRmForNonEmptyDirectory (3.350s)
2023/08/17 17:17:47 [INFO]  TestAccFsRmForNonExistentFile (0.940s)
2023/08/17 17:17:51 [INFO]  TestAccFsRmForNonEmptyDirectoryWithRecursiveFlag (3.570s)
2023/08/17 17:17:52 [INFO]  TestAccGitClone (0.890s)
2023/08/17 17:17:52 [INFO]  TestAccGitCloneWithOnlyRepoNameOnAlternateBranch (0.730s)
2023/08/17 17:17:53 [INFO]  TestAccGitCloneErrorsWhenRepositoryDoesNotExist (0.540s)
2023/08/17 17:18:02 [INFO]  TestAccLock (8.800s)
2023/08/17 17:18:06 [INFO]  TestAccLockUnlockWithoutAllowsLockFileNotExist (3.930s)
2023/08/17 17:18:09 [INFO]  TestAccLockUnlockWithAllowsLockFileNotExist (3.320s)
2023/08/17 17:18:20 [INFO]  TestAccSyncFullFileSync (10.570s)
2023/08/17 17:18:31 [INFO]  TestAccSyncIncrementalFileSync (11.460s)
2023/08/17 17:18:42 [INFO]  TestAccSyncNestedFolderSync (10.850s)
2023/08/17 17:18:53 [INFO]  TestAccSyncNestedFolderDoesntFailOnNonEmptyDirectory (10.650s)
2023/08/17 17:19:04 [INFO]  TestAccSyncNestedSpacePlusAndHashAreEscapedSync (10.930s)
2023/08/17 17:19:11 [INFO]  TestAccSyncIncrementalFileOverwritesFolder (7.010s)
2023/08/17 17:19:18 [INFO]  TestAccSyncIncrementalSyncPythonNotebookToFile (7.380s)
2023/08/17 17:19:24 [INFO]  TestAccSyncIncrementalSyncFileToPythonNotebook (6.220s)
2023/08/17 17:19:30 [INFO]  TestAccSyncIncrementalSyncPythonNotebookDelete (5.530s)
2023/08/17 17:19:32 [INFO]  TestAccSyncEnsureRemotePathIsUsableIfRepoDoesntExist (2.620s)
2023/08/17 17:19:38 [INFO]  TestAccSyncEnsureRemotePathIsUsableIfRepoExists (5.460s)
2023/08/17 17:19:40 [INFO]  TestAccSyncEnsureRemotePathIsUsableInWorkspace (1.850s)
2023/08/17 17:19:40 [INFO]  TestAccWorkspaceList (0.780s)
2023/08/17 17:19:51 [INFO]  TestAccExportDir (10.350s)
2023/08/17 17:19:54 [INFO]  TestAccExportDirDoesNotOverwrite (3.330s)
2023/08/17 17:19:58 [INFO]  TestAccExportDirWithOverwriteFlag (3.770s)
2023/08/17 17:20:07 [INFO]  TestAccImportDir (9.320s)
2023/08/17 17:20:24 [INFO]  TestAccImportDirDoesNotOverwrite (16.950s)
2023/08/17 17:20:35 [INFO]  TestAccImportDirWithOverwriteFlag (10.620s)
2023/08/17 17:20:35 [INFO]  68/68 passed, 0 failed, 3 skipped
```
2023-08-18 08:07:25 +00:00
Andrew Nester 56dcd3f0a7
Renamed `environments` to `targets` in bundle configuration (#670)
## Changes
Renamed Environments to Targets in bundle.yml.

The change is backward-compatible and customers can continue to use
`environments` for the time being.
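
A minimal sketch of the renamed section (the host is a placeholder):

```yaml
bundle:
  name: my_bundle

targets:
  dev:
    default: true
    workspace:
      host: https://<workspace-url>
```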

## Tests
Added tests which check that both the `environments` and `targets` sections
in bundle.yml work correctly.
2023-08-17 15:22:32 +00:00
shreyas-goenka 61b103318f
Use custom prompter for bundle template inputs (#663)
## Changes
The prompt UI glitches often. We are switching to a custom implementation of
a simple prompter, which is much more stable.
This also allows new lines in prompts, which the MLflow team has asked for.

## Tests
Tested manually
2023-08-15 14:50:20 +00:00
Andrew Nester 5cdaacacc3
Locked terraform binary version to <= 1.5.5 (#666)
## Changes
Locked terraform binary version to <= 1.5.5
2023-08-15 13:39:32 +00:00
shreyas-goenka 6430d23453
Print y/n options when displaying prompts using cmdio.Ask (#650)
## Changes
Adds `[y/n]` in `cmdio.Ask` to make the options obvious in all question
prompts

## Tests
Tested manually. Works.
2023-08-09 09:22:42 +00:00
Lennart Kats (databricks) d55652be07
Extend deployment mode support (#577)
## Changes

This adds the `mode: production` option. This mode doesn't do any
transformations but verifies that an environment is configured correctly
for production:

```
environments:
  prod:
    mode: production

    # paths should not be scoped to a user (unless a service principal is used)
    root_path: /Shared/non_user_path/...

    # run_as and permissions should be set at the resource level (or at the top level when that is implemented)
    run_as:
      user_name: Alice
    permissions:
    - level: CAN_MANAGE
      user_name: Alice
```

Additionally, this extends the existing `mode: development` option,
* now prefixing deployed assets with `[dev your.user]` instead of just
`[dev]`
* validating that development deployments _are_ scoped to a user
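
For comparison, a minimal sketch of the existing development mode
configuration:

```yaml
environments:
  dev:
    mode: development
```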

## Related

https://github.com/databricks/cli/pull/578/files (in draft)

## Tests

Manual testing to validate the experience, error messages, and
functionality with all resource types. Automated unit tests.

---------

Co-authored-by: Fabian Jakobs <fabian.jakobs@databricks.com>
2023-07-30 07:19:49 +00:00
dependabot[bot] 65d8fe13e9
Bump github.com/databricks/databricks-sdk-go from 0.12.0 to 0.13.0 (#585)
Bumps
[github.com/databricks/databricks-sdk-go](https://github.com/databricks/databricks-sdk-go)
from 0.12.0 to 0.13.0.
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/databricks/databricks-sdk-go/releases">github.com/databricks/databricks-sdk-go's
releases</a>.</em></p>
<blockquote>
<h2>v0.13.0</h2>
<ul>
<li>Add issue templates (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/pull/539">#539</a>).</li>
<li>Added HasRequiredNonBodyField method (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/pull/536">#536</a>).</li>
<li>Make Azure MSI auth account compatible (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/pull/544">#544</a>).</li>
<li>Refactor Handling of Name<->ID Mapping in
OpenAPI Generator (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/pull/547">#547</a>).</li>
<li>Regenerate Go SDK from current OpenAPI Specification (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/pull/549">#549</a>).</li>
<li>Parse Camel Case and Pascal Case Enum Values (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/pull/550">#550</a>).</li>
<li>Prepare for auto-releaser infra (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/pull/554">#554</a>).</li>
<li>Added SCIM Patch Acceptance Tests (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/pull/540">#540</a>).</li>
</ul>
<p>API Changes:</p>
<ul>
<li>Removed <code>Maintenance</code> method for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#MetastoresAPI">w.Metastores</a>
workspace-level service.</li>
<li>Added <code>EnableOptimization</code> method for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#MetastoresAPI">w.Metastores</a>
workspace-level service.</li>
<li>Added <code>Update</code> method for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#TablesAPI">w.Tables</a>
workspace-level service.</li>
<li>Added <code>Force</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#DeleteAccountMetastoreRequest">catalog.DeleteAccountMetastoreRequest</a>.</li>
<li>Added <code>Force</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#DeleteAccountStorageCredentialRequest">catalog.DeleteAccountStorageCredentialRequest</a>.</li>
<li>Removed <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#UpdateAutoMaintenance">catalog.UpdateAutoMaintenance</a>.</li>
<li>Removed <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#UpdateAutoMaintenanceResponse">catalog.UpdateAutoMaintenanceResponse</a>.</li>
<li>Added <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#UpdatePredictiveOptimization">catalog.UpdatePredictiveOptimization</a>.</li>
<li>Added <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#UpdatePredictiveOptimizationResponse">catalog.UpdatePredictiveOptimizationResponse</a>.</li>
<li>Added <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#UpdateTableRequest">catalog.UpdateTableRequest</a>.</li>
<li>Added <code>Schema</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/iam#PartialUpdate">iam.PartialUpdate</a>.</li>
<li>Added <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/iam#PatchSchema">iam.PatchSchema</a>.</li>
<li>Added <code>TriggerInfo</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#BaseRun">jobs.BaseRun</a>.</li>
<li>Added <code>Health</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#CreateJob">jobs.CreateJob</a>.</li>
<li>Added <code>JobSource</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#GitSource">jobs.GitSource</a>.</li>
<li>Added <code>OnDurationWarningThresholdExceeded</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#JobEmailNotifications">jobs.JobEmailNotifications</a>.</li>
<li>Added <code>Health</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#JobSettings">jobs.JobSettings</a>.</li>
<li>Added <code>TriggerInfo</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#Run">jobs.Run</a>.</li>
<li>Added <code>RunJobOutput</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#RunOutput">jobs.RunOutput</a>.</li>
<li>Added <code>RunJobTask</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#RunTask">jobs.RunTask</a>.</li>
<li>Added <code>EmailNotifications</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#SubmitRun">jobs.SubmitRun</a>.</li>
<li>Added <code>Health</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#SubmitRun">jobs.SubmitRun</a>.</li>
<li>Added <code>EmailNotifications</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#SubmitTask">jobs.SubmitTask</a>.</li>
<li>Added <code>Health</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#SubmitTask">jobs.SubmitTask</a>.</li>
<li>Added <code>NotificationSettings</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#SubmitTask">jobs.SubmitTask</a>.</li>
<li>Added <code>Health</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#Task">jobs.Task</a>.</li>
<li>Added <code>RunJobTask</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#Task">jobs.Task</a>.</li>
<li>Added <code>OnDurationWarningThresholdExceeded</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#TaskEmailNotifications">jobs.TaskEmailNotifications</a>.</li>
<li>Added <code>OnDurationWarningThresholdExceeded</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#WebhookNotifications">jobs.WebhookNotifications</a>.</li>
<li>Added <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#JobSource">jobs.JobSource</a>.</li>
<li>Added <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#JobSourceDirtyState">jobs.JobSourceDirtyState</a>.</li>
<li>Added <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#JobsHealthMetric">jobs.JobsHealthMetric</a>.</li>
<li>Added <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#JobsHealthOperator">jobs.JobsHealthOperator</a>.</li>
<li>Added <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#JobsHealthRule">jobs.JobsHealthRule</a>.</li>
<li>Added <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#JobsHealthRules">jobs.JobsHealthRules</a>.</li>
<li>Added <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#RunJobOutput">jobs.RunJobOutput</a>.</li>
<li>Added <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#RunJobTask">jobs.RunJobTask</a>.</li>
</ul>
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="b4fb746b3b"><code>b4fb746</code></a>
Release v0.13.0 (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/issues/555">#555</a>)</li>
<li><a
href="180c7eea4c"><code>180c7ee</code></a>
Added SCIM Patch Acceptance Tests (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/issues/540">#540</a>)</li>
<li><a
href="546814a272"><code>546814a</code></a>
Prepare for auto-releaser infra (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/issues/554">#554</a>)</li>
<li><a
href="7e680c5ba8"><code>7e680c5</code></a>
Parse Camel Case and Pascal Case Enum Values (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/issues/550">#550</a>)</li>
<li><a
href="e71ece4ccd"><code>e71ece4</code></a>
Bump google.golang.org/api from 0.130.0 to 0.131.0 (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/issues/551">#551</a>)</li>
<li><a
href="3b4492b6d6"><code>3b4492b</code></a>
Regenerate Go SDK from current OpenAPI Specification (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/issues/549">#549</a>)</li>
<li><a
href="f84de6111a"><code>f84de61</code></a>
Refactor Handling of Name&lt;-&gt;ID Mapping in OpenAPI Generator (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/issues/547">#547</a>)</li>
<li><a
href="c37a894872"><code>c37a894</code></a>
Bump google.golang.org/api from 0.129.0 to 0.130.0 (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/issues/542">#542</a>)</li>
<li><a
href="4f2aa38e75"><code>4f2aa38</code></a>
Bump golang.org/x/mod from 0.11.0 to 0.12.0 (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/issues/541">#541</a>)</li>
<li><a
href="e80f6e16ff"><code>e80f6e1</code></a>
Make Azure MSI auth account compatible (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/issues/544">#544</a>)</li>
<li>Additional commits viewable in <a
href="https://github.com/databricks/databricks-sdk-go/compare/v0.12.0...v0.13.0">compare
view</a></li>
</ul>
</details>

---------

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Serge Smertin <serge.smertin@databricks.com>
2023-07-18 15:30:00 +00:00
Gleb Kanterov 179154477e
Propagate TF_CLI_CONFIG_FILE env variable (#555)
## Changes
Propagate `TF_CLI_CONFIG_FILE` env variable.

From Terraform documentation:

> The location of the Terraform CLI configuration file can also be
specified using the TF_CLI_CONFIG_FILE [environment
variable](https://developer.hashicorp.com/terraform/cli/config/environment-variables)

It allows using custom builds of terraform-provider-databricks, using
config files like:

```tf
provider_installation {
  dev_overrides {
    "databricks/databricks" = "/Users/gleb.kanterov/terraform-provider-databricks"
  }

  direct {}
}
```

## Tests
I added unit tests.
2023-07-07 13:20:37 +02:00
Pieter Noordhuis ad8183d7a9
Bump Go SDK to v0.12.0 (#540)
## Changes

* Regenerate CLI commands
* Ignore `account-access-control-proxy` (see #505)

## Tests

Unit and integration tests pass.
2023-07-03 11:46:45 +02:00
shreyas-goenka 5d036ab6b8
Fix locker unlock for destroy (#492)
## Changes
Adds the ability for unlock to succeed even if the deploy file is missing.
 
## Tests
Tested using integration tests and manually.
2023-06-19 15:57:25 +02:00
Pieter Noordhuis 1875908b59
Pass through proxy related environment variables (#465)
## Changes

If proxy-related environment variables are set on the host, we must pass
them through to Terraform.

## Tests

Unit tests pass.
2023-06-14 21:58:26 +02:00
Pieter Noordhuis 349e2aff40
Allow equivalence checking of filer errors to fs errors (#416)
## Changes

The pattern `errors.Is(err, fs.ErrNotExist)` is common to check for an
error type.

Errors can implement `Is(error) bool` with a custom equivalence checker.

## Tests

New asserts all pass in the integration test.
2023-05-31 20:47:00 +02:00
Pieter Noordhuis e4ab455ea1
Don't pass synthesized TMPDIR if not already set (#409)
## Changes

On Unix systems, the default of `/tmp` always works. No need to
synthesize a path for it.

The custom TMPDIR was causing issues when used from GitHub Actions
runners.

## Tests

Confirmed manually this fixes the issue on GitHub Actions runners.
2023-05-26 13:05:30 +02:00
Andrew Nester 6141476ca2
Added support for bundle.Seq, simplified Mutator.Apply interface (#403)
## Changes
Added support for `bundle.Seq` and simplified the `Mutator.Apply` interface
by removing the list of mutators from the return values.

## Tests
1. Ran `cli bundle deploy` and interrupted it with Cmd + C mid-execution so
the lock is not released
2. Ran `cli bundle deploy` to make sure that the CLI does not try to release
the lock when it fails to acquire it
```
andrew.nester@HFW9Y94129 multiples-tasks % cli bundle deploy
Starting upload of bundle files
Uploaded bundle files at /Users/andrew.nester@databricks.com/.bundle/simple-task/development/files!

^C
andrew.nester@HFW9Y94129 multiples-tasks % cli bundle deploy
Error: deploy lock acquired by andrew.nester@databricks.com at 2023-05-24 12:10:23.050343 +0200 CEST. Use --force to override
```
2023-05-24 14:45:19 +02:00
shreyas-goenka c53ad860e6
Create tmp files in the cache dir in terraform command runs (#395)
## Changes
Passes through tmp dir related env vars to the terraform process. In case
any of them are not set, we assign a temp dir inside the bundle cache dir as
the location terraform should use.

## Tests
Manually checked that these env vars do override the location where
os.CreateTemp files are created.
2023-05-23 13:51:15 +02:00
Pieter Noordhuis 98ebb78c9b
Rename bricks -> databricks (#389)
## Changes

Rename all instances of "bricks" to "databricks".

## Tests

* Confirmed the goreleaser build works, uses the correct new binary
name, and produces the right archives.
* Help output is confirmed to be correct.
* Output of `git grep -w bricks` is minimal with a couple changes
remaining for after the repository rename.
2023-05-16 18:35:39 +02:00
Andrew Nester 33fb0b3c40
Do not truncate local state file when pulling remote changes (#382)
## Changes
When a local state file exists, it won't be overwritten by the remote state file.

## Tests
Running `bricks bundle deploy` after a failed state push does not overwrite
the local state file.

Use cases verified:
1. Local state file is newer than remote
2. Local state file is older than remote
3. Local state file does not exist
4. Local state file corrupted
2023-05-16 17:02:33 +02:00
Andrew Nester 473d2bf503
Improved error message when 'bricks bundle run' is executed before 'bricks bundle deploy' (#378)
## Changes
Improved error message when 'bricks bundle run' is executed before
'bricks bundle deploy'

The error happens when we attempt to load terraform state when it does
not exist.

The best way to check if terraform state actually exists is to call
`terraform show -json` and that's what already happens here

https://github.com/databricks/bricks/compare/main...error-before-deploy#diff-8c50f8c04e568397bc865b7e02d1f4ec5b18379d8d32daddfeb041035d804f5fL28

Absence of `state.Values` indicates that there is no state and the bundle
was likely never deployed.

## Tests
Ran `bricks bundle run test_job` on a new non-deployed bundle.

**Output:**

`Error: terraform show: No state. Did you forget to run 'bricks bundle
deploy'?`

Running `bricks bundle deploy && bricks bundle run test_job` succeeds.

---------

Co-authored-by: Pieter Noordhuis <pieter.noordhuis@databricks.com>
2023-05-10 11:02:25 +02:00
Serge Smertin 9581187c9e
Update to Go SDK v0.8.0 (#351)
## Changes

- Update to Go SDK v0.8.0
- Fix all breaking changes

## Tests

- make test
2023-04-21 10:30:20 +02:00
shreyas-goenka 598ad62688
Log mutator messages using progress logger (#312)
This PR uses progress logger to log messages inside mutators
2023-04-18 16:55:06 +02:00
shreyas-goenka 85889dffb1
Move state to event for whether they support inplace progress logging (#339)
## Changes
Adds an IsInplaceSupported() function to the event interface. Any event
that uses the progress logger now has to declare whether it supports
in-place logging.

## Tests
Manually
2023-04-18 14:20:35 +02:00
Pieter Noordhuis b388f4a0dc
Make all workspace paths string fields (#327)
## Changes

These are unlikely to ever be DBFS paths so we can remove this level of indirection to simplify.

**Note:** this is a breaking change. Downstream usage of these fields must be updated.

## Tests

Existing tests pass.
2023-04-12 16:54:36 +02:00
Miles Yucht 946906221d
Delete sync snapshots file when destroying a bundle (#323)
## Changes
This PR changes the files.Delete() mutator to delete the sync snapshots
file on destroy. This ensures that files will be uploaded when the
bundle is uploaded again.

## Tests
- [x] Manual test: Ran `bricks bundle destroy`, observed that the sync
snapshots file was deleted.
2023-04-11 16:57:01 +02:00
Pieter Noordhuis 42d29f92c9
Pass through $HOME when invoking Terraform (#319)
## Changes

This is useful when developing the Databricks Terraform provider where
you keep a local-only build of the provider and refer to it using $HOME
from `~/.terraformrc`, for example like this:

```
plugin_cache_dir = "$HOME/.terraform.d/plugin-cache"
```

## Tests

That $HOME is passed through cannot be tested as is because the
`tfexec.Terraform` struct doesn't expose it through public fields or
methods. What can be tested is a successful run of the initialize
mutator and this is included in this commit.
2023-04-11 13:11:31 +02:00
shreyas-goenka 4871f7bc8a
Add bundle destroy command (#300)
Adds bundle destroy capability to bricks
2023-04-06 12:54:58 +02:00
Pieter Noordhuis d7ac265536
Allow use of file library in pipeline (#308)
## Changes

This requires databricks/databricks-sdk-go#359.

## Tests

Tests pass and ran manual verification of deployment with files.
2023-04-05 16:29:42 +02:00
Pieter Noordhuis 04e77102c9
Add mutators to pull and push Terraform state (#288)
## Changes

Pull state before deploying and push state after deploying.

Note: the run command was missing mutators to initialize Terraform. This
is necessary if the cache directory is removed between running "deploy"
and "run" (which is valid now that we synchronize state).

## Tests

Manually.
2023-03-30 12:01:09 +02:00
Pieter Noordhuis 0ea0e81c8a
Ignore databricks_permissions resource when loading Terraform state (#291)
## Changes

The databricks_permissions resource may be generated if a bundle
resource includes a `permissions` block. There's no need to incorporate
details from the materialization into the bundle configuration struct.

## Tests

Confirmed that this fixes `bricks bundle run` when dealing with a bundle
with permission configuration.
2023-03-29 21:14:52 +02:00
Pieter Noordhuis 87207bba78
Configure Terraform provider auth through env vars (#290)
## Changes

Auth relied on setting a profile. In this change we enumerate all
configuration properties and export all non-empty ones as a map of
environment variables. We then pass this map to the Terraform execution
wrapper.

This results in Terraform using the bundle's authentication
configuration.

This change is needed to make #287 work.

## Tests

Manually.
2023-03-29 20:46:09 +02:00
Pieter Noordhuis 123a5e15e9
Acquire lock prior to deploy (#270)
Add configuration:

```
bundle:
  lock:
    enabled: true
    force: false
```

The force field can be set by passing the `--force` argument to `bricks
bundle deploy`. Doing so means the deployment lock is acquired even if
it is currently held. This should only be used in exceptional cases
(e.g. a previous deployment has failed to release the lock).
2023-03-22 16:37:26 +01:00
Pieter Noordhuis 66ca9ec266
Add permissions block to each resource (#264)
Example:

```yaml
resources:
  jobs:
    my_job:
      name: "[${bundle.environment}] My job"
      permissions:
        - level: CAN_VIEW
          group_name: users
```
2023-03-21 10:58:16 +01:00
Pieter Noordhuis 58563b1ea9
Add resources for mlflow models and experiments (#263)
Manually confirmed that both can be deployed.
2023-03-20 21:28:43 +01:00
Pieter Noordhuis ad666ff796
Use new logger throughout codebase (#256) 2023-03-17 15:17:31 +01:00
Pieter Noordhuis a0ed02281d
Execute file synchronization on deploy (#211)
1. Perform file synchronization on deploy
2. Update notebook file path translation logic to point to the
synchronization target rather than treating the notebook as an artifact
and uploading it separately.
2023-02-20 19:42:55 +01:00
Pieter Noordhuis 414ea4f891
Bump databricks-sdk-go to 0.3.2 (#215) 2023-02-20 16:00:20 +01:00
Pieter Noordhuis 35243db33c
Automatically install Terraform if needed (#141)
Users can opt out and use the system-installed version with the
following configuration:

```
bundle:
  terraform:
    exec_path: terraform
```

This will find the binary on $PATH and replace the configured value with the resolved path.

If this is not set, the initialize phase will install Terraform in the
bundle's cache directory.
2022-12-15 17:30:33 +01:00
Pieter Noordhuis b111416fe5
Add `bricks bundle run` command (#134) 2022-12-15 15:12:47 +01:00
Pieter Noordhuis 72e89bf33c
Use pointers to resources in bundle configuration (#140)
Avoid copy-by-value when iterating over these maps.
2022-12-15 13:00:41 +01:00
Pieter Noordhuis d713521d63
Convert job task libraries to TF JSON (#132) 2022-12-12 16:36:59 +01:00
Pieter Noordhuis 8640696b4b
Add minimal test for conversion to TF JSON format (#130) 2022-12-12 11:31:28 +01:00