mirror of https://github.com/databricks/cli.git
Commit 913e10a037
## Changes

It is now possible to configure a new `app` resource in a bundle and point it to a custom `source_code_path` location where the Databricks App code is defined.

On `databricks bundle deploy`, DABs will create the app. Every subsequent `databricks bundle deploy` will update the existing app if anything has changed.

On `databricks bundle run <my_app>`, DABs will execute an app deployment. If the app is not started yet, it will start the app first.

### Bundle configuration

```
bundle:
  name: apps

variables:
  my_job_id:
    description: "ID of job to run app"
    lookup:
      job: "My Job"
  databricks_name:
    description: "Name for app user"
  additional_flags:
    description: "Additional flags to run command app"
    default: ""
  my_app_config:
    type: complex
    description: "Configuration for my Databricks App"
    default:
      command:
        - flask
        - --app
        - hello
        - run
        - ${var.additional_flags}
      env:
        - name: DATABRICKS_NAME
          value: ${var.databricks_name}

resources:
  apps:
    my_app:
      name: "anester-app" # required and has to be unique
      description: "My App"
      source_code_path: ./app # required and points to location of app code
      config: ${var.my_app_config}
      resources:
        - name: "my-job"
          description: "A job for app to be able to run"
          job:
            id: ${var.my_job_id}
            permission: "CAN_MANAGE_RUN"
      permissions:
        - user_name: "foo@bar.com"
          level: "CAN_VIEW"
        - service_principal_name: "my_sp"
          level: "CAN_MANAGE"

targets:
  dev:
    variables:
      databricks_name: "Andrew (from dev)"
      additional_flags: --debug
  prod:
    variables:
      databricks_name: "Andrew (from prod)"
```

### Execution

1. `databricks bundle deploy -t dev`
2. `databricks bundle run my_app -t dev`

**If the app is started**

```
✓ Getting the status of the app my-app
✓ App is in RUNNING state
✓ Preparing source code for new app deployment.
✓ Deployment is pending
✓ Starting app with command: flask --app hello run --debug
✓ App started successfully
You can access the app at <app-url>
```

**If the app is not started**

```
✓ Getting the status of the app my-app
✓ App is in UNAVAILABLE state
✓ Starting the app my-app
✓ App is starting...
....
✓ App is starting...
✓ App is started!
✓ Preparing source code for new app deployment.
✓ Downloading source code from /Workspace/Users/...
✓ Starting app with command: flask --app hello run --debug
✓ App started successfully
You can access the app at <app-url>
```

## Tests

Added unit and config tests + a manual test.

```
--- PASS: TestAccDeployBundleWithApp (404.59s)
PASS
coverage: 36.8% of statements in ./...
ok      github.com/databricks/cli/internal/bundle       405.035s        coverage: 36.8% of statements in ./...
```
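As a complement to the example above, here is a minimal, assumption-based sketch (not part of this change) of how the deployed app could be inspected with the Databricks Go SDK after `databricks bundle deploy`. The `Apps.Get` call, the `GetAppRequest` type, and the `AppStatus.State` field are assumptions about the SDK surface and may differ between SDK versions; the app name comes from the example config.

```go
// Hedged sketch: look up the app created by the bundle via the Go SDK.
// Method and field names are assumptions; check your databricks-sdk-go version.
package main

import (
	"context"
	"fmt"

	"github.com/databricks/databricks-sdk-go"
	"github.com/databricks/databricks-sdk-go/service/apps"
)

func main() {
	ctx := context.Background()

	// Reads authentication from the environment / .databrickscfg, as usual for the SDK.
	w := databricks.Must(databricks.NewWorkspaceClient())

	// Fetch the app created by `databricks bundle deploy` (name taken from the config above).
	app, err := w.Apps.Get(ctx, apps.GetAppRequest{Name: "anester-app"})
	if err != nil {
		panic(err)
	}
	fmt.Printf("app %s is in state %s\n", app.Name, app.AppStatus.State)
}
```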
# Integration tests
This directory contains integration tests for the project.
The tree structure generally mirrors the source code tree structure.
Requirements for new files in this directory:

- Every package must be named after its directory with `_test` appended.
  - Requiring a different package name for integration tests avoids aliasing with the main package.
- Every integration test package must include a `main_test.go` file (a minimal sketch follows this list).

These requirements are enforced by a unit test in this directory.
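For illustration, a minimal sketch of what such a `main_test.go` could look like. This is an assumption-based example rather than the file used in this repository; it only demonstrates the package-naming convention and a `TestMain` gate on the `CLOUD_ENV` variable described below.

```go
// Hypothetical main_test.go for a package under integration/bundle.
// The package name mirrors the directory name with _test appended.
package bundle_test

import (
	"os"
	"testing"
)

// TestMain gates the whole package: when CLOUD_ENV is not set, the
// integration tests are skipped instead of hitting a real workspace.
func TestMain(m *testing.M) {
	if os.Getenv("CLOUD_ENV") == "" {
		os.Exit(0)
	}
	os.Exit(m.Run())
}
```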
## Running integration tests
Integration tests require the following environment variables:

- `CLOUD_ENV` - set to the cloud environment to use (e.g. `aws`, `azure`, `gcp`)
- `DATABRICKS_HOST` - set to the Databricks workspace to use
- `DATABRICKS_TOKEN` - set to the Databricks token to use
Optional environment variables:

- `TEST_DEFAULT_WAREHOUSE_ID` - set to the default warehouse ID to use
- `TEST_METASTORE_ID` - set to the metastore ID to use
- `TEST_INSTANCE_POOL_ID` - set to the instance pool ID to use
- `TEST_BRICKS_CLUSTER_ID` - set to the cluster ID to use
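As an illustration of how a test might consume one of these optional variables, here is a hedged sketch; `getEnvOrSkip` and `TestQueryOnDefaultWarehouse` are hypothetical names, not helpers from this repository.

```go
package bundle_test

import (
	"os"
	"testing"
)

// getEnvOrSkip is a hypothetical helper: it returns the variable's value,
// or skips the test when the optional variable is not set.
func getEnvOrSkip(t *testing.T, name string) string {
	t.Helper()
	value := os.Getenv(name)
	if value == "" {
		t.Skipf("environment variable %s is not set", name)
	}
	return value
}

func TestQueryOnDefaultWarehouse(t *testing.T) {
	warehouseID := getEnvOrSkip(t, "TEST_DEFAULT_WAREHOUSE_ID")
	_ = warehouseID // the real test would run a query against this warehouse
}
```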
To run all integration tests, use the following command:

```
go test ./integration/...
```

Alternatively:

```
make integration
```