>>> cat databricks.yml
bundle:
  name: wheel-task

workspace:
  root_path: "~/.bundle/[UNIQUE_NAME]"

include:
  - empty.yml

resources:
  jobs:
    some_other_job:
      name: "[${bundle.target}] Test Wheel Job [UNIQUE_NAME]"
      tasks:
        - task_key: TestTask
          new_cluster:
            num_workers: 1
            spark_version: 13.3.x-snapshot-scala2.12
            node_type_id: [NODE_TYPE_ID]
            data_security_mode: USER_ISOLATION
            instance_pool_id: [TEST_INSTANCE_POOL_ID]
          python_wheel_task:
            package_name: my_test_code
            entry_point: run
            parameters:
              - "one"
              - "two"
          libraries:
            - whl: ./dist/*.whl

>>> [CLI] bundle deploy
Building python_artifact...
Uploading my_test_code-0.0.1-py3-none-any.whl...
Uploading bundle files to /Workspace/Users/[USERNAME]/.bundle/[UNIQUE_NAME]/files...
Deploying resources...
Updating deployment state...
Deployment complete!

>>> [CLI] bundle run some_other_job
Run URL: [DATABRICKS_URL]/?o=[NUMID]#job/[NUMID]/run/[NUMID]

[TIMESTAMP] "[default] Test Wheel Job [UNIQUE_NAME]" RUNNING
[TIMESTAMP] "[default] Test Wheel Job [UNIQUE_NAME]" TERMINATED SUCCESS
Hello from my func
Got arguments: ['my_test_code', 'one', 'two']

=== Make a change to code without version change and run the job again

>>> [CLI] bundle deploy
Building python_artifact...
Uploading my_test_code-0.0.1-py3-none-any.whl...
Uploading bundle files to /Workspace/Users/[USERNAME]/.bundle/[UNIQUE_NAME]/files...
Deploying resources...
Updating deployment state...
Deployment complete!

>>> [CLI] bundle run some_other_job
Run URL: [DATABRICKS_URL]/?o=[NUMID]#job/[NUMID]/run/[NUMID]

[TIMESTAMP] "[default] Test Wheel Job [UNIQUE_NAME]" RUNNING
[TIMESTAMP] "[default] Test Wheel Job [UNIQUE_NAME]" TERMINATED SUCCESS
UPDATED MY FUNC
Got arguments: ['my_test_code', 'one', 'two']

>>> [CLI] bundle destroy --auto-approve
The following resources will be deleted:
  delete job some_other_job

All files and directories at the following location will be deleted:
/Workspace/Users/[USERNAME]/.bundle/[UNIQUE_NAME]

Deleting files...
Destroy complete!
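
=== For reference: a minimal sketch of the kind of module the wheel above could contain.
=== Only the package name (my_test_code), the entry point name (run), and the printed output
=== come from the transcript; the file path and function name below are assumptions.

# my_test_code/__main__.py (hypothetical layout)
import sys


def main():
    # python_wheel_task runs the wheel's "run" entry point; sys.argv[0] is the
    # package name, followed by the parameters from databricks.yml, which
    # yields ['my_test_code', 'one', 'two'] as seen in the run output above.
    print("Hello from my func")
    print(f"Got arguments: {sys.argv}")


if __name__ == "__main__":
    main()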
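
=== The "Building python_artifact" step and the uploaded file name
=== my_test_code-0.0.1-py3-none-any.whl imply a standard setuptools project.
=== A sketch of a matching setup.py; the entry-point group name and module
=== path are assumptions, not taken from this transcript.

# setup.py (hypothetical; name and version match the uploaded wheel)
from setuptools import find_packages, setup

setup(
    name="my_test_code",
    version="0.0.1",
    packages=find_packages(),
    # The "run" entry point referenced by python_wheel_task is resolved from
    # the wheel's metadata; the group name "group_1" here is an assumption.
    entry_points={"group_1": ["run=my_test_code.__main__:main"]},
)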