databricks-cli/acceptance/bundle/integration_whl/wrapper/output.txt


>>> cat input.json
{
  "project_name": "my_test_code",
  "spark_version": "12.2.x-scala2.12",
  "node_type_id": "[NODE_TYPE_ID]",
  "unique_id": "[UNIQUE_NAME]",
  "python_wheel_wrapper": true,
  "instance_pool_id": "[TEST_INSTANCE_POOL_ID]"
}
✨ Successfully initialized template
>>> cat databricks.yml
bundle:
  name: wheel-task

workspace:
  root_path: "~/.bundle/[UNIQUE_NAME]"

experimental:
  python_wheel_wrapper: true

resources:
  jobs:
    some_other_job:
      name: "[${bundle.target}] Test Wheel Job [UNIQUE_NAME]"
      tasks:
        - task_key: TestTask
          new_cluster:
            num_workers: 1
            spark_version: "12.2.x-scala2.12"
            node_type_id: "[NODE_TYPE_ID]"
            data_security_mode: USER_ISOLATION
            instance_pool_id: "[TEST_INSTANCE_POOL_ID]"
          python_wheel_task:
            package_name: my_test_code
            entry_point: run
            parameters:
              - "one"
              - "two"
          libraries:
            - whl: ./dist/*.whl
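For `entry_point: run` to resolve when the task starts, the wheel has to register a matching entry point in its packaging metadata. A minimal sketch in pyproject.toml form (the module path `my_test_code.main:main` is an assumption for illustration; the template's actual packaging file is not shown in this transcript):

```toml
[project]
name = "my_test_code"
version = "0.0.1"

[project.scripts]
# "run" must match entry_point in python_wheel_task;
# the module path here is hypothetical.
run = "my_test_code.main:main"
```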
>>> [CLI] bundle deploy
Building python_artifact...
Uploading my_test_code-0.0.1-py3-none-any.whl...
Uploading bundle files to /Workspace/Users/[USERNAME]/.bundle/[UNIQUE_NAME]/files...
Deploying resources...
Updating deployment state...
Deployment complete!
>>> [CLI] bundle run some_other_job
Run URL: [DATABRICKS_URL]/?o=[NUMID]#job/[NUMID]/run/[NUMID]
[TIMESTAMP] "[default] Test Wheel Job [UNIQUE_NAME]" RUNNING
[TIMESTAMP] "[default] Test Wheel Job [UNIQUE_NAME]" TERMINATED SUCCESS
Hello from my func
Got arguments:
['my_test_code', 'one', 'two']
>>> [CLI] bundle destroy --auto-approve
The following resources will be deleted:
delete job some_other_job
All files and directories at the following location will be deleted: /Workspace/Users/[USERNAME]/.bundle/[UNIQUE_NAME]
Deleting files...
Destroy complete!