Commit Graph

268 Commits

Author SHA1 Message Date
Andrew Nester f269f8015d
Added `bundle generate pipeline` command (#1139)
## Changes
Added `bundle generate pipeline` command

Usage is as follows:

```
databricks bundle generate pipeline --existing-pipeline-id f3b8c580-0a88-4b55-xxxx-yyyyyyyyyy
```

## Tests
Manually + added E2E test
2024-01-25 11:35:14 +00:00
Ilia Babanov 9c3e4fda7c
Add "bundle summary" command (#1123)
The plan is to use the new command in the Databricks VSCode extension to
render "modified" UI state in the bundle resource tree elements, plus
use resource IDs to generate links for the resources

### New revision
- Renamed `remote-state` to `summary`
- Added "modified statuses" to all resources. Currently we don't set
"updated" status - it's either nothing, or created/deleted
- Added tests for the `TerraformToBundle` command
2024-01-25 11:32:47 +00:00
shreyas-goenka cf2a1c38ba
Set run_as permissions after variable interpolation (#1141)
## Changes

This PR sets `run_as` permissions after variable interpolation.

Terraform does not allow specifying permissions for the current user.

The following configuration would fail because we would assign a
permission block for self, bypassing this check here:
4ee926b885/bundle/config/mutator/run_as.go (L47)

```
run_as:
  user_name: ${workspace.current_user.userName}
```



## Tests
Manually; setting `run_as` to `${workspace.current_user.userName}` now works.
2024-01-24 12:22:04 +00:00
Andrew Nester 1b6241746e
Use MockWorkspaceClient from SDK instead of WithImpl mocking (#1134)
## Changes
Use MockWorkspaceClient from SDK instead of WithImpl mocking
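
A minimal sketch of this testing style, assuming the SDK's
`experimental/mocks` package with mockery-style `EXPECT()` setters and
testify (the job ID and assertions are illustrative):

```go
package example

import (
	"context"
	"testing"

	"github.com/databricks/databricks-sdk-go/experimental/mocks"
	"github.com/databricks/databricks-sdk-go/service/jobs"
	"github.com/stretchr/testify/mock"
	"github.com/stretchr/testify/require"
)

func TestGetJob(t *testing.T) {
	// NewMockWorkspaceClient wires mockery-generated service mocks
	// into a real *databricks.WorkspaceClient.
	m := mocks.NewMockWorkspaceClient(t)
	m.GetMockJobsAPI().EXPECT().
		Get(mock.Anything, jobs.GetJobRequest{JobId: 123}).
		Return(&jobs.Job{JobId: 123}, nil)

	// Code under test receives the embedded WorkspaceClient and
	// cannot tell it is talking to mocks.
	job, err := m.WorkspaceClient.Jobs.Get(context.Background(), jobs.GetJobRequest{JobId: 123})
	require.NoError(t, err)
	require.Equal(t, int64(123), job.JobId)
}
```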
2024-01-19 14:12:58 +00:00
Andrew Nester 70fe0e36ef
Added `databricks bundle generate job` command (#1043)
## Changes
Now it's possible to generate bundle configuration for an existing job.
For now it only supports jobs with notebook tasks.

It will download the notebooks referenced in the job tasks and generate
bundle YAML config for the job, which can be included in a larger bundle.

Example of generated config
```
resources:
  jobs:
    job_128737545467921:
      name: Notebook job
      format: MULTI_TASK
      tasks:
        - task_key: as_notebook
          existing_cluster_id: 0704-xxxxxx-yyyyyyy
          notebook_task:
            base_parameters:
              bundle_root: /Users/andrew.nester@databricks.com/.bundle/job_with_module_imports/development/files
            notebook_path: ./entry_notebook.py
            source: WORKSPACE
          run_if: ALL_SUCCESS
      max_concurrent_runs: 1
```

## Tests
Manual (on our last 100 jobs) + added end-to-end test

```
--- PASS: TestAccGenerateFromExistingJobAndDeploy (50.91s)
PASS
coverage: 61.5% of statements in ./...
ok github.com/databricks/cli/internal/bundle 51.209s coverage: 61.5% of
statements in ./...
```
2024-01-17 14:26:33 +00:00
Andrew Nester ef67b1755e
Do not require positional arguments if they should be provided in JSON (#1125)
## Changes
Do not require positional arguments if they should be provided in JSON

Fixes #1122
2024-01-17 10:53:50 +00:00
Pieter Noordhuis 06b50670e1
Support passing job parameters to bundle run (#1115)
## Changes

This change adds support for job parameters. If job parameters are
specified for a job that doesn't define job parameters it returns an
error. Conversely, if task parameters are specified for a job that
defines job parameters, it also returns an error.

This change moves the options structs and their functions to separate
files and backfills test coverage for them.

Job parameters can now be specified with `--params foo=bar,bar=qux`.
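
For example (assuming a job resource named `my_job` defined in the
bundle):

```
$ databricks bundle run my_job --params foo=bar,bar=qux
```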

## Tests

Unit tests and manual integration testing.
2024-01-15 07:42:36 +00:00
Pieter Noordhuis 3c76a11d00
Upgrade Go SDK to v0.29.0 (#1111)
## Changes

See:
* https://github.com/databricks/databricks-sdk-go/releases/tag/v0.29.0
* https://github.com/databricks/databricks-sdk-go/releases/tag/v0.28.0

## Tests

Unit and integration tests pass.
2024-01-11 08:16:25 +00:00
Pieter Noordhuis f5c46478f4
Upgrade golang.org/x/crypto to v0.17.0 in internal module (#1110)
## Changes

This addresses https://github.com/databricks/cli/security/dependabot/12.
2024-01-10 13:53:01 +00:00
Andrew Nester 4b01fff03d
Fixed instance pool resolving by name (#1102)
## Changes
Fixed instance pool resolving by name

## Tests
Added regression test
2024-01-05 10:50:53 +00:00
Andrew Nester 5fb40f9d07
Allow referencing bundle resources by name (#872)
## Changes
Now we can define variables with values that reference Databricks
resources by name.
When a variable is referenced like this, DABs automatically looks up the
resource by name and replaces the reference with the ID of the resource.
Thus, when the variable is used in the configuration, it contains the
correctly resolved resource ID.

The resolvers are code-generated, so DABs supports referencing all
resources that have `GetByName`-like methods in the Go SDK.

### Example

```
variables:
  my_cluster_id:
    description: An existing cluster.
    lookup: 
      cluster: "12.2 shared"

resources:
  jobs:
    my_job:
      name: "My Job"
      tasks:
        - task_key: TestTask
          existing_cluster_id: ${var.my_cluster_id}

targets:
  dev:
    variables:
      my_cluster_id:
        lookup: 
           cluster: "dev-cluster"
```

## Tests
Added unit test + manual testing

---------

Co-authored-by: shreyas-goenka <88374338+shreyas-goenka@users.noreply.github.com>
2024-01-04 21:04:42 +00:00
Lennart Kats (databricks) 167deec8c3
Change recommended production deployment path from /Shared to /Users (#1091)
## Changes

This PR changes the default and `mode: production` recommendation to
target `/Users` for deployment. Previously, we used `/Shared`, but
because of a lack of POSIX-like permissions in WorkspaceFS this meant
that files inside would be readable and writable by other users in the
workspace.

Detailed change:
* `default-python` no longer uses a path that starts with `/Shared`
* `mode: production` no longer requires a path that starts with
`/Shared`
 
## Related PRs

Docs: https://github.com/databricks/docs/pull/14585
Examples: https://github.com/databricks/bundle-examples/pull/17

## Tests

* Manual tests
* Template unit tests (with an extra check to avoid /Shared)
2024-01-02 19:58:24 +00:00
Lennart Kats (databricks) 9a1f078bd9
Improve error when bundle root is not writable (#1093)
## Changes

This improves the error when deploying to a bundle root that the current
user doesn't have write access to. This can come up slightly more often
since the change of https://github.com/databricks/cli/pull/1091.

Before this change:

```
$ databricks bundle deploy --target prod
Building my_project...
Error: no such directory: /Users/lennart.kats@databricks.com/.bundle/my_project/prod/state
```

After this change:

```
$ databricks bundle deploy --target prod
Building my_project...
Error: cannot write to deployment root (this can indicate a previous deploy was done with a different identity): /Users/lennart.kats@databricks.com/.bundle/my_project/prod
```

Note that this change uses the "no such directory" error returned from
the filer.
2023-12-28 13:15:21 +00:00
Pieter Noordhuis fa3c8b1017
Use resource key as name in permissions code (#1087)
## Changes

The code relied on the `Name` property being accessible for every
resource. This is generally true, but because these property structs are
embedded as pointer, they can be nil. This is also why the tests had to
initialize the embedded struct to pass. This changes the approach to use
the keys from the resource map instead, so that we no longer rely on the
non-nil embedded struct.
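
A contrived, runnable sketch (not the actual resource types) of why
accessing a promoted field through a nil embedded pointer panics:

```go
package main

import "fmt"

// JobSettings stands in for the embedded property struct.
type JobSettings struct {
	Name string
}

// Job embeds the settings as a pointer, so it can be nil.
type Job struct {
	*JobSettings
}

func main() {
	j := Job{} // embedded pointer left nil, as in the failing tests
	defer func() {
		if r := recover(); r != nil {
			fmt.Println("panic:", r) // nil pointer dereference
		}
	}()
	fmt.Println(j.Name) // promoted field access panics when *JobSettings is nil
}
```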

Note: we should evaluate whether we should turn these into values
instead of pointers. I don't recall if we get value from them being
pointers.

## Tests

Unit tests pass.
2023-12-22 14:45:53 +00:00
Andrew Nester ac37a592f1
Added exec.NewCommandExecutor to execute commands with correct interpreter (#1075)
## Changes
Instead of handling command chaining ourselves, we execute the passed
commands as-is by storing them in a temp file and passing it to the
correct interpreter (bash or cmd) based on the OS.
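
A simplified sketch of the approach (hypothetical helper, not the
actual `exec.NewCommandExecutor` implementation):

```go
package main

import (
	"os"
	"os/exec"
	"runtime"
)

// runScript writes the command to a temp file and hands it to the
// OS-appropriate interpreter, instead of chaining commands ourselves.
func runScript(command string) error {
	ext, interpreter, flag := ".sh", "bash", "-e"
	if runtime.GOOS == "windows" {
		ext, interpreter, flag = ".cmd", "cmd.exe", "/C"
	}
	f, err := os.CreateTemp("", "script-*"+ext)
	if err != nil {
		return err
	}
	defer os.Remove(f.Name())
	if _, err := f.WriteString(command); err != nil {
		return err
	}
	if err := f.Close(); err != nil {
		return err
	}
	cmd := exec.Command(interpreter, flag, f.Name())
	cmd.Stdout, cmd.Stderr = os.Stdout, os.Stderr
	return cmd.Run()
}

func main() {
	_ = runScript("echo building && echo done")
}
```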

Fixes #1065 

## Tests
Added unit tests
2023-12-21 15:45:23 +00:00
Lennart Kats (databricks) 875c9d2db1
Tune output of bundle deploy command (#1047)
## Changes

Update the output of the `deploy` command to be more concise and
consistent:
```
$ databricks bundle deploy
Building my_project...
Uploading my_project-0.0.1+20231207.205106-py3-none-any.whl...
Uploading bundle files to /Users/lennart.kats@databricks.com/.bundle/my_project/dev/files...
Deploying resources...
Updating deployment state...
Deployment complete!
```

This does away with the intermediate success messages, makes consistent
use of `...`, and only prints the success message at the very end after
everything is completed.

Below is the original output for comparison:

```
$ databricks bundle deploy
Detecting Python wheel project...
Found Python wheel project at /tmp/output/my_project
Building my_project...
Build succeeded
Uploading my_project-0.0.1+20231207.205134-py3-none-any.whl...
Upload succeeded
Starting upload of bundle files
Uploaded bundle files at /Users/lennart.kats@databricks.com/.bundle/my_project/dev/files!

Starting resource deployment
Resource deployment completed!
```
2023-12-21 08:00:37 +00:00
shreyas-goenka 2d93f62f21
Set metadata fields required to enable break-glass UI for jobs (#880)
## Changes

This PR sets the following fields for all jobs that are deployed from a
DAB
1. `deployment`: This provides the platform with the path to a file to
read the metadata from.
2. `edit_mode`: This tells the platform to display the break-glass UI
for jobs deployed from a DAB. Setting this is required to re-lock the UI
after a user clicks "disconnect from source".
3. `format = MULTI_TASK`. This makes the Terraform provider always use
jobs API 2.1 for creating/updating the job. Required because
`deployment` and `edit_mode` are only available in API 2.1. A sketch of
the resulting job settings is shown below.
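
Illustratively, the deployed job settings then carry something like the
following (values such as `UI_LOCKED` and `BUNDLE` are assumptions about
the enum values, and the path is made up):

```json
{
  "format": "MULTI_TASK",
  "edit_mode": "UI_LOCKED",
  "deployment": {
    "kind": "BUNDLE",
    "metadata_file_path": "/Users/someone@example.com/.bundle/my_bundle/default/state/metadata.json"
  }
}
```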

## Tests

Unit test and manually. Manually verified that deployments trigger the
break glass UI. Manually verified there is no Terraform drift when all
three fields are set.

---------

Co-authored-by: Pieter Noordhuis <pieter.noordhuis@databricks.com>
2023-12-19 07:38:52 +00:00
Pieter Noordhuis cee70a53c8
Test existing behavior when loading non-string spark conf values (#1071)
## Changes

This test is expected to fail when we enable the custom YAML loader.
2023-12-18 11:22:22 +00:00
Andrew Nester a6ec9ac08b
Upgrade Go SDK to 0.27.0 (#1064)
## Changes
Upgrade Go SDK to 0.27.0
2023-12-14 08:15:00 +00:00
Pieter Noordhuis 37671d9f54
Fix passthrough of pipeline notifications (#1058)
## Changes

Notifications weren't passed along because of a plural vs singular
mismatch.

## Tests

* Added unit test coverage.
* Manually confirmed it now works in an example bundle.
2023-12-12 11:36:06 +00:00
shreyas-goenka b479a7cf67
Upgrade Terraform schema version to v1.31.1 (#1055)
Co-authored-by: Pieter Noordhuis <pieter.noordhuis@databricks.com>
2023-12-11 21:53:21 +00:00
shreyas-goenka 6002f49c87
Move bundle schema update to an internal module (#1012)
## Changes

This PR:
1. Move code to load bundle JSON Schema descriptions from the OpenAPI
spec to an internal Go module
2. Remove command line flags from the `bundle schema` command. These
flags were meant for internal processes and at no point were meant for
customer use.
3. Regenerate `bundle_descriptions.json`
4. Add support for `bundle: "deprecated"`. The `environments` field is
tagged as deprecated in this PR and consequently will no longer be a
part of the bundle schema (see the sketch below).
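
For illustration, a trimmed-down sketch of the struct-tag pattern
(stand-in types; the real `Root` struct has more fields):

```go
package config

// Bundle and Target stand in for the real configuration types.
type Bundle struct{}
type Target struct{}

// Root is a trimmed-down sketch of the bundle configuration struct.
// The `bundle:"deprecated"` tag marks a field so the schema generator
// drops it from the published bundle schema.
type Root struct {
	Bundle       Bundle             `json:"bundle"`
	Environments map[string]*Target `json:"environments,omitempty" bundle:"deprecated"`
	Targets      map[string]*Target `json:"targets,omitempty"`
}
```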

## Tests
Tested by regenerating the CLI against its current OpenAPI spec (as
defined in `__openapi_sha`). The `bundle_descriptions.json` in this PR
was generated by the code generator.

Manually checked that the autocompletion / descriptions from the new
bundle schema are correct.
2023-12-06 10:45:18 +00:00
Andrew Nester 83d50001fc
Pass parameters to task when run with `--python-params` and `python_wheel_wrapper` is true (#1037)
## Changes
This makes the behaviour consistent, whether `python_wheel_wrapper` is
on or off, when a job is run with the `--python-params` flag.

In `python_wheel_wrapper` mode, it converts dynamic `python_params` into
a specially named dynamic `notebook_param`, which the wrapper reads with
`dbutils` and passes to `sys.argv`.

Fixes #1000

## Tests
Added an integration test.

Integration tests pass.
2023-12-01 10:35:20 +00:00
shreyas-goenka 677926b78b
Fix panic when bundle auth resolution fails (#1002)
## Changes
The CLI would panic if an invalid bundle auth was set up when running
CLI commands. This PR removes the panic and shows the error message
directly instead.

## Tests
The CWD is a bundle with:
```
workspace:
  profile: DEFAULT
```

Before:
```
shreyas.goenka@THW32HFW6T bundle-playground % cli clusters list
panic: resolve: /Users/shreyas.goenka/.databrickscfg has no DEFAULT profile configured. Config: profile=DEFAULT

goroutine 1 [running]:
```

After:
```
shreyas.goenka@THW32HFW6T bundle-playground % cli clusters list
Error: cannot resolve bundle auth configuration: resolve: /Users/shreyas.goenka/.databrickscfg has no DEFAULT profile configured. Config: profile=DEFAULT
```

```
shreyas.goenka@THW32HFW6T bundle-playground % DATABRICKS_CONFIG_FILE=/dev/null cli bundle deploy
Error:  cannot resolve bundle auth configuration: resolve: /dev/null has no DEFAULT profile configured. Config: profile=DEFAULT, config_file=/dev/null. Env: DATABRICKS_CONFIG_FILE
```
2023-11-30 14:28:01 +00:00
Andrew Nester 4d8d825746
Fixed panic when job has trigger and in development mode (#1026)
## Changes
Fixed a panic that occurred when a job has a trigger and is deployed in
development mode.
2023-11-29 16:32:42 +00:00
Andrew Nester 833746cbdd
Do not replace pipeline libraries if there are no matches for pattern (#1021)
## Changes
If there are no matches when globbing for a defined pipeline library,
leave the entry as-is. The next mutators in the chain will detect that
the file is missing, and the error will be more user-friendly.


Before the change

```
Starting resource deployment
Error: terraform apply: exit status 1

Error: cannot create pipeline: libraries must contain at least one element
```

After

```
Error: notebook ./non-existent not found
```


## Tests
Added regression unit tests
2023-11-29 13:20:13 +00:00
Andrew Nester 5431174302
Do not add wheel content hash in uploaded Python wheel path (#1015)
## Changes
Removed the hash from the upload path since it's not useful anyway.

The main reason for that change was to make it work on all-purpose
clusters. But in order to make that work, the wheel version needs to be
increased anyway, so having only the hash in the path is useless.

Note: using the --build-number (build tag) flag does not help with
re-installing libraries on all-purpose clusters. The reason is that
`pip` ignores the build tag when upgrading a library and only looks at
the wheel version.
The build tag is only used for sorting versions, and the one with the
higher build tag takes priority when installing. It only works if no
library is installed yet.
See
a15dd75d98/src/pip/_internal/index/package_finder.py (L522-L556)
https://github.com/pypa/pip/issues/4781

Thus, the only way to reinstall the library on an all-purpose cluster is
to increase the wheel version manually or use automatic version
generation, e.g.
```
setup(
  version=datetime.datetime.utcnow().strftime("%Y%m%d.%H%M%S"),
  ...
)
```

## Tests
Integration tests passed.
2023-11-29 10:40:12 +00:00
Pieter Noordhuis 6187803007
Correctly overwrite local state if remote state is newer (#1008)
## Changes

A bug in the code that pulls the remote state could cause the local
state to be empty instead of a copy of the remote state. This happened
only if the local state was present and stale when compared to the
remote version.

We correctly checked for the state serial to see if the local state had
to be replaced but didn't seek back on the remote state before writing
it out. Because the staleness check would read the remote state in full,
copying from the same reader would immediately yield an EOF.
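
A self-contained sketch of the bug pattern: once a seekable reader has
been drained by the staleness check, it must be rewound before it can be
copied out.

```go
package main

import (
	"bytes"
	"fmt"
	"io"
	"strings"
)

func main() {
	// Stand-in for the remote state: a seekable reader.
	remote := strings.NewReader(`{"serial": 5}`)

	// The staleness check reads the stream to the end...
	buf, _ := io.ReadAll(remote)
	_ = buf

	// ...so copying from the same reader now yields 0 bytes (EOF).
	var out bytes.Buffer
	n, _ := io.Copy(&out, remote)
	fmt.Println("without seek:", n, "bytes") // 0

	// The fix: rewind before writing the remote state out.
	remote.Seek(0, io.SeekStart)
	n, _ = io.Copy(&out, remote)
	fmt.Println("with seek:", n, "bytes") // full length
}
```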

## Tests

* Unit tests for state pull and push mutators that rely on a mocked
filer.
* An integration test that deploys the same bundle from multiple paths,
triggering the staleness logic.

Both failed prior to the fix and now pass.
2023-11-24 11:15:46 +00:00
Andrew Nester 48e293c72c
Pass `USERPROFILE` environment variable to Terraform (#1001)
## Changes
It appears that the `USERPROFILE` env variable indicates where the Azure
CLI stores configuration data (the `.azure` folder).

https://learn.microsoft.com/en-us/cli/azure/azure-cli-configuration#cli-configuration-file

Passing it to the terraform executable allows it to authenticate
correctly using the Azure CLI.
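
A simplified sketch (not the actual CLI code) of forwarding the
variable to the terraform child process:

```go
package main

import (
	"os"
	"os/exec"
)

func main() {
	cmd := exec.Command("terraform", "apply")
	// Start from a minimal environment and forward USERPROFILE so the
	// Azure CLI invoked by the provider can find its config (.azure).
	cmd.Env = append(cmd.Env,
		"PATH="+os.Getenv("PATH"),
		"USERPROFILE="+os.Getenv("USERPROFILE"),
	)
	cmd.Stdout, cmd.Stderr = os.Stdout, os.Stderr
	_ = cmd.Run()
}
```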

Fixes #983 

## Tests
Ran a deployment on a Windows VM before and after the fix.
2023-11-22 09:16:28 +00:00
Andrew Nester fa89db57e9
Enable `spark_jar_task` with local JAR libraries (#993)
## Changes
Previously, local JAR paths were transformed to remote paths during
initialisation, so the artifact building logic did not recognise such
libraries as local files to be handled and uploaded.

Now it's possible to use `spark_jar_task` with local JAR libraries on
14.1+ DBR clusters.

Example configuration
```
bundle:
  name: spark-jar

workspace:
  host: ***

artifacts:
  my_java_code:
    path: ./sample-java
    build: "javac PrintArgs.java && jar cvfm PrintArgs.jar META-INF/MANIFEST.MF PrintArgs.class"
    files:
      - source: "/Users/andrew.nester/dabs/wheel/sample-java/PrintArgs.jar"

resources:
  jobs:
    print_args:
      name: "Print Args"
      tasks:
        - task_key: Print
          new_cluster:
            num_workers: 0
            spark_version: 14.2.x-scala2.12
            node_type_id: i3.xlarge
            spark_conf:
              "spark.databricks.cluster.profile": "singleNode"
              "spark.master": "local[*]"
            custom_tags:
              ResourceClass: "SingleNode"
          spark_jar_task:
            main_class_name: PrintArgs
          libraries:
            - jar: ./sample-java/PrintArgs.jar
```

## Tests
Manually running `bundle deploy` and `bundle run`
2023-11-21 10:15:09 +00:00
Pieter Noordhuis 489d6fa1b8
Replace direct calls with `bundle.Apply` (#990)
## Changes

Some test call sites called directly into the mutator's `Apply` function
instead of `bundle.Apply`. Calling into `bundle.Apply` is preferred
because that's where we can run pre/post logic common across all
mutators.
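
A self-contained sketch (stand-in types, not the real ones) of why
routing through a package-level `Apply` matters:

```go
package main

import (
	"context"
	"fmt"
)

// Bundle is a minimal stand-in for the real bundle type.
type Bundle struct{ Name string }

// Mutator mirrors the shape of the real interface.
type Mutator interface {
	Name() string
	Apply(ctx context.Context, b *Bundle) error
}

// Apply is where logic common to all mutators (logging, tracing,
// error wrapping) lives; tests should call this, not m.Apply directly.
func Apply(ctx context.Context, b *Bundle, m Mutator) error {
	fmt.Printf("running mutator %s\n", m.Name()) // common pre-logic
	return m.Apply(ctx, b)
}

type setName struct{ name string }

func (m setName) Name() string { return "SetName" }
func (m setName) Apply(ctx context.Context, b *Bundle) error {
	b.Name = m.name
	return nil
}

func main() {
	b := &Bundle{}
	_ = Apply(context.Background(), b, setName{name: "demo"})
	fmt.Println(b.Name)
}
```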

## Tests

Pass.
2023-11-15 14:19:18 +00:00
Pieter Noordhuis d80c35f66a
Rename variable `bundle -> b` (#989)
## Changes

All calls to apply a mutator must go through `bundle.Apply`. This
conflicts with the existing use of the variable `bundle`. This change
un-aliases the variable from the package name by renaming all variables
to `b`.

## Tests

Pass.
2023-11-15 14:03:36 +00:00
shreyas-goenka 0c837e5772
Make `file_path` and `artifact_path` fields consistent with json tag (#987)
## Changes
This PR:
1. Renames `FilesPath` -> `FilePath` and `ArtifactsPath` ->
`ArtifactPath` in the bundle and metadata configuration to make them
consistent with the json tags (sketched below).
2. Fixes development / production mode error messages to point to
`file_path` and `artifact_path`.
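
Illustrative of the rename (trimmed-down structs, not the full types):

```go
package config

// Before: Go field names disagreed with their JSON tags.
type WorkspaceBefore struct {
	FilesPath     string `json:"file_path"`
	ArtifactsPath string `json:"artifact_path"`
}

// After: field names match the tags one-to-one.
type WorkspaceAfter struct {
	FilePath     string `json:"file_path"`
	ArtifactPath string `json:"artifact_path"`
}
```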

## Tests
Existing unit tests. This is a straightforward renaming of the fields.
2023-11-15 13:37:26 +00:00
shreyas-goenka 0f58f6c875
Serialise empty files_path and job.relative_path in the deployment metadata (#984)
## Changes
The Jobs service expects these fields to always be present in the
metadata in their validation logic, which is reasonable. This PR removes
the omitempty tags so these fields are always uploaded to the workspace
`metadata.json` file.
2023-11-14 16:28:32 +00:00
Lennart Kats (databricks) 0ab125c109
Allow jobs to be manually unpaused in development mode (#885)
Partly mitigates #859. It's still not clear to me if there is an actual
use case or if users are trying to use "development" mode jobs for
production, but making this overridable is reasonable.

Beyond this fix I think we could do something in the Jobs schedule UI,
but it would help to better understand the use case (or the actual
reason for confusion). I expect we should hint customers to move away
from dev mode rather than unpausing.
2023-11-13 19:50:39 +00:00
Andrew Nester f3db42e622
Added support for top-level permissions (#928)
## Changes
Now it's possible to define a top-level `permissions` section in bundle
configuration, and the permissions defined there will be applied to all
resources defined in the bundle.

Supported top-level permission levels: CAN_MANAGE, CAN_VIEW, CAN_RUN.

Permissions are applied to: Jobs, DLT Pipelines, ML Models, ML
Experiments and Model Serving Endpoints.

```
bundle:
  name: permissions

workspace:
  host: ***

permissions:
  - level: CAN_VIEW
    group_name: test-group
  - level: CAN_MANAGE
    user_name: user@company.com
  - level: CAN_RUN
    service_principal_name: 123456-abcdef
```

## Tests
Added corresponding unit tests + ran `bundle validate` and `bundle
deploy` manually
2023-11-13 11:29:40 +00:00
Pieter Noordhuis 7847388f95
Initialize variable definitions that are defined without properties (#966)
## Changes

We can debate whether or not variable definitions without properties are
valid, but in no case should this panic the CLI.

Fixes #934.

## Tests

Unit.
2023-11-08 11:01:14 +00:00
Michał Szafrański 10291b0e13
Bundle path rewrites for dbt and SQL file tasks (#962)
## Changes
Support path rewrites for dbt and SQL file job tasks.
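
An illustrative job snippet (hypothetical paths; the relevant fields are
assumed to be `sql_task.file.path` and `dbt_task.project_directory`)
where local relative paths are rewritten on deploy:

```
resources:
  jobs:
    analytics:
      tasks:
        - task_key: run_sql
          sql_task:
            file:
              path: ./queries/report.sql
        - task_key: run_dbt
          dbt_task:
            project_directory: ./dbt_project
            commands:
              - dbt run
```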

## Tests
* Added unit test
2023-11-07 20:00:09 +00:00
shreyas-goenka b6aa4631f1
Fix metadata computation for empty bundle (#939)
## Changes
This PR fixes metadata computation for an empty bundle. Before, we would
error because the `terraform.Load()` mutator errors on an empty /
missing state file.

## Tests
Failing integration tests now pass.
2023-11-02 11:00:30 +00:00
shreyas-goenka 5a8cd0c5bc
Persist deployment metadata in WSFS (#845)
## Changes

This PR introduces a metadata struct that stores a subset of bundle
configuration that we wish to expose to other Databricks services that
wish to integrate with bundles.

This metadata file is uploaded to a file
`${bundle.workspace.state_path}/metadata.json` in the WSFS destination
of the bundle deployment.

Documentation for emitted metadata fields:
* `version`: Version for the metadata file schema
* `config.bundle.git.branch`: Name of the git branch the bundle was
deployed from.
* `config.bundle.git.origin_url`: URL for git remote "origin"
* `config.bundle.git.bundle_root_path`: Relative path of the bundle root
from the root of the git repository. Is set to "." if they are the same.
* `config.bundle.git.commit`: SHA-1 commit hash of the exact commit this
bundle was deployed from. Note, the deployment might not exactly match
this commit version if there are changes that have not been committed to
git at deploy time.
* `file_path`: Path in workspace where we sync bundle files to. 
* `resources.jobs.[job-ref].id`: Id of the job
* `resources.jobs.[job-ref].relative_path`: Relative path of the yaml
config file from the bundle root where this job was defined.

Example metadata object when bundle root and git root are the same:
```json
{
  "version": 1,
  "config": {
    "bundle": {
      "lock": {},
      "git": {
        "branch": "master",
        "origin_url": "www.host.com",
        "commit": "7af8e5d3f5dceffff9295d42d21606ccf056dce0",
        "bundle_root_path": "."
      }
    },
    "workspace": {
      "file_path": "/Users/shreyas.goenka@databricks.com/.bundle/pipeline-progress/default/files"
    },
    "resources": {
      "jobs": {
        "bar": {
          "id": "245921165354846",
          "relative_path": "databricks.yml"
        }
      }
    },
    "sync": {}
  }
}
```

Example metadata when the git root is one level above the bundle repo:
```json
{
  "version": 1,
  "config": {
    "bundle": {
      "lock": {},
      "git": {
        "branch": "dev-branch",
        "origin_url": "www.my-repo.com",
        "commit": "3db46ef750998952b00a2b3e7991e31787e4b98b",
        "bundle_root_path": "pipeline-progress"
      }
    },
    "workspace": {
      "file_path": "/Users/shreyas.goenka@databricks.com/.bundle/pipeline-progress/default/files"
    },
    "resources": {
      "jobs": {
        "bar": {
          "id": "245921165354846",
          "relative_path": "databricks.yml"
        }
      }
    },
    "sync": {}
  }
}
```


This unblocks integration to the jobs break glass UI for bundles.

## Tests
Unit tests and integration tests.
2023-10-27 12:55:43 +00:00
shreyas-goenka bb662fadbb
Bump Terraform provider to v1.29.0 (#926)
This PR:
1. Regenerates Go structs using provider version 1.29
2. Adds QOL autogenerated diff labels for GitHub
3. Adds a small SOP for doing the Terraform provider bump for Go structs
2023-10-27 09:16:41 +00:00
Andrew Nester 6f22ae8696
Use UserName instead of Id to check if identity used is a service principal (#924)
## Changes
Use UserName instead of Id to check if the identity used is a service
principal.
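
A sketch of the underlying heuristic, assuming (as this change implies)
that service principal identities have a UUID as their `UserName`:

```go
package main

import (
	"fmt"

	"github.com/google/uuid"
)

// isServicePrincipal reports whether the identity's UserName parses
// as a UUID, which is how service principals are named.
func isServicePrincipal(userName string) bool {
	_, err := uuid.Parse(userName)
	return err == nil
}

func main() {
	fmt.Println(isServicePrincipal("8b948b2e-d2b5-4b9e-8f7d-2a8e9ec6cf93")) // true
	fmt.Println(isServicePrincipal("user@company.com"))                     // false
}
```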
2023-10-26 14:58:16 +00:00
Andrew Nester 19e00d2d47
Upload terraform state even if apply fails (#923)
## Changes
Upload terraform state even if apply fails

Fixes #893 

## Tests
Manually ran `databricks bundle deploy` with incorrect permissions in
the bundle config and observed that the state gets uploaded correctly.
2023-10-26 14:38:01 +00:00
Pieter Noordhuis 6e21ced54a
Consolidate bundle configuration loader function (#918)
## Changes

There were two functions related to loading a bundle configuration file;
one as a package function and one as a member function on the
configuration type. Loading the same configuration object twice doesn't
make sense and therefore we can consolidate to only using the package
function.

The package function would scan for known file names if the specified
path was a directory. This functionality was not in use because the
top-level bundle loader figures out the filename itself as of #580.

## Tests

Pass.
2023-10-25 12:55:56 +00:00
Pieter Noordhuis 486bf59627
Move bundle configuration filename code (#917)
## Changes

This is unrelated to the config root so belongs in a separate file (this
was added in #580).

## Tests

n/a
2023-10-25 09:54:39 +00:00
Lennart Kats (databricks) 9049f11479
Fix wheel task not working with 13.x clusters (#898)
## Changes

This lets us recognize 13.x as "13.1 or higher," making it possible to
use wheel tasks on 13.x-snapshot clusters.
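
A rough sketch (hypothetical helper, not the actual CLI code) of
treating a `13.x` version as satisfying a 13.1+ requirement:

```go
package main

import (
	"fmt"
	"strconv"
	"strings"
)

// supportsWheelTasks reports whether a spark_version like
// "13.1.x-scala2.12" or "13.x-snapshot-scala2.12" is 13.1 or higher.
func supportsWheelTasks(sparkVersion string) bool {
	parts := strings.Split(sparkVersion, ".")
	if len(parts) < 2 {
		return false
	}
	major, err := strconv.Atoi(parts[0])
	if err != nil {
		return false
	}
	if major != 13 {
		return major > 13
	}
	// Treat "13.x" (e.g. snapshot clusters) as "13.1 or higher".
	if strings.HasPrefix(parts[1], "x") {
		return true
	}
	minor, err := strconv.Atoi(parts[1])
	return err == nil && minor >= 1
}

func main() {
	fmt.Println(supportsWheelTasks("13.x-snapshot-scala2.12")) // true
	fmt.Println(supportsWheelTasks("12.2.x-scala2.12"))        // false
}
```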
2023-10-23 08:19:26 +00:00
Pieter Noordhuis d4be40520c
Resolve configuration before performing verification (#890)
## Changes

If a bundle configuration specifies a workspace host, and the user
specifies a profile to use, we perform a check to confirm that the
workspace host in the bundle configuration and the workspace host from
the profile are identical. If they are not, we return an error. The
check was introduced in #571.

Previously, the code included an assumption that the client
configuration was already loaded from the environment prior to
performing the check. This was not the case, and as such if the user
intended to use a non-default path to `.databrickscfg`, this path was
not used when performing the check.

The fix does the following:
* Resolve the configuration prior to performing the check.
* Don't treat the configuration file not existing as an error.
* Add unit tests.

Fixes #884.

## Tests

Unit tests and manual confirmation.
2023-10-20 13:10:31 +00:00
Andrew Nester 7b1d972b33
Do not emit wheel wrapper error when python_wheel_wrapper setting is true (#894)
## Changes
Do not emit wheel wrapper error when python_wheel_wrapper setting is
true

Fixes #892 

## Tests
Added a regression test
2023-10-20 12:32:04 +00:00
Andrew Nester 5273d0c51a
Support Python wheels larger than 10MB (#879)
## Changes
Previously we only supported uploading Python wheels smaller than 10 MB
due to using the Workspace.Import API and its `content` field:
https://docs.databricks.com/api/workspace/workspace/import

By switching to `WorkspaceFilesClient` we overcome the limit because it
uses the POST body for the API instead.

## Tests
`TestAccUploadArtifactFileToCorrectRemotePath` integration test passes

```
=== RUN   TestAccUploadArtifactFileToCorrectRemotePath
    artifacts_test.go:28: gcp
2023/10/17 15:24:04 INFO Using Google Credentials sdk=true
    helpers.go:356: Creating /Users/.../integration-test-wsfs-ekggbkcfdkid
artifacts.Upload(test.whl): Uploading...
2023/10/17 15:24:06 INFO Using Google Credentials mutator=artifacts.Upload(test) sdk=true
artifacts.Upload(test.whl): Upload succeeded
    helpers.go:362: Removing /Users/.../integration-test-wsfs-ekggbkcfdkid
--- PASS: TestAccUploadArtifactFileToCorrectRemotePath (5.66s)
PASS
coverage: 14.9% of statements in ./...
ok      github.com/databricks/cli/internal      6.109s  coverage: 14.9% of statements in ./...
```
2023-10-18 10:20:43 +00:00
Arpit Jasapara 24cc67563e
Support Unity Catalog Registered Models in bundles (#846)
## Changes
Add UC Registered Models support to Databricks Asset Bundles as new
resource `registered_model`. Also added UC Permission support via new
resource `grant`.

## Tests
Tested via unit tests and manual testing with [example
PR](https://github.com/databricks/bundle-examples-internal/pull/80) and
[custom Terraform
provider](https://github.com/databricks/terraform-provider-databricks/pull/2771).
<img width="698" alt="Screenshot 2023-10-08 at 4 57 23 PM"
src="https://github.com/databricks/cli/assets/87999496/bcf605a9-7894-443b-865a-f7e240037815">
<img width="1109" alt="Screenshot 2023-10-08 at 4 56 47 PM"
src="https://github.com/databricks/cli/assets/87999496/e4d6e424-cd70-4809-8843-6939ed2e172f">
<img width="1091" alt="Screenshot 2023-10-08 at 4 56 57 PM"
src="https://github.com/databricks/cli/assets/87999496/88ebaabb-67db-4a11-88a5-df087e2e41c0">

---------

Signed-off-by: Arpit Jasapara <arpit.jasapara@databricks.com>
Co-authored-by: Andrew Nester <andrew.nester.dev@gmail.com>
Co-authored-by: Pieter Noordhuis <pieter.noordhuis@databricks.com>
2023-10-16 15:32:49 +00:00