Shreyas Goenka
4141f4ea34
-
2024-08-27 17:49:09 +02:00
Shreyas Goenka
aac6687900
remove more todos
2024-08-27 17:46:57 +02:00
Shreyas Goenka
40f4d35c4a
cleanup todos
2024-08-27 17:45:20 +02:00
Shreyas Goenka
ad7503a176
cleanup todos
2024-08-27 17:40:36 +02:00
Shreyas Goenka
f194a5b86a
skip adding patterns for string schemas
2024-08-27 17:36:12 +02:00
Shreyas Goenka
acc43092ad
fix lint
2024-08-27 14:17:04 +02:00
Shreyas Goenka
727036bb40
bundle schema command is OK now
2024-08-27 13:46:52 +02:00
Shreyas Goenka
483480f4a9
-
2024-08-27 13:39:23 +02:00
Shreyas Goenka
2c30cfa5a2
-
2024-08-27 13:38:42 +02:00
Shreyas Goenka
ac60163c19
move gen file
2024-08-27 13:35:56 +02:00
Shreyas Goenka
5b7974757d
-
2024-08-27 13:28:03 +02:00
Shreyas Goenka
e7fd063e8a
-
2024-08-27 13:10:13 +02:00
Shreyas Goenka
fcdccb359a
cleanup unused code
2024-08-27 12:47:49 +02:00
Shreyas Goenka
d07192fe03
merge
2024-08-27 11:54:41 +02:00
Shreyas Goenka
2dea889621
delete the old schema code
2024-08-27 11:53:29 +02:00
Shreyas Goenka
870e41912a
from_type method is mostly complete
2024-08-27 11:42:33 +02:00
dependabot[bot]
056d203236
Bump github.com/databricks/databricks-sdk-go from 0.44.0 to 0.45.0 ( #1719 )
...
Bumps [github.com/databricks/databricks-sdk-go](https://github.com/databricks/databricks-sdk-go) from 0.44.0 to 0.45.0.

**Release notes** (sourced from [github.com/databricks/databricks-sdk-go's releases](https://github.com/databricks/databricks-sdk-go/releases)):

## v0.45.0

### Bug Fixes
* Add INVALID_STATE to error code mapping ([#1014](https://redirect.github.com/databricks/databricks-sdk-go/pull/1014)).
* Do not specify `--tenant` flag when fetching managed identity access token from the CLI ([#1021](https://redirect.github.com/databricks/databricks-sdk-go/pull/1021)).

### Internal Changes
* Add terraform aliases to Entity ([#1017](https://redirect.github.com/databricks/databricks-sdk-go/pull/1017)).
* Added Service.NamedIdMap ([#1016](https://redirect.github.com/databricks/databricks-sdk-go/pull/1016)).
* Fix billing test for budget configuration update ([#1019](https://redirect.github.com/databricks/databricks-sdk-go/pull/1019)).

### API Changes
* Added [w.PolicyComplianceForClusters](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/compute#PolicyComplianceForClustersAPI) workspace-level service.
* Added [w.PolicyComplianceForJobs](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#PolicyComplianceForJobsAPI) workspace-level service.
* Added [w.ResourceQuotas](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#ResourceQuotasAPI) workspace-level service.
* Added [catalog.GetQuotaRequest](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#GetQuotaRequest), [catalog.GetQuotaResponse](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#GetQuotaResponse), [catalog.ListQuotasRequest](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#ListQuotasRequest), [catalog.ListQuotasResponse](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#ListQuotasResponse) and [catalog.QuotaInfo](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#QuotaInfo).
* Added [compute.ClusterCompliance](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/compute#ClusterCompliance), [compute.ClusterSettingsChange](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/compute#ClusterSettingsChange), [compute.EnforceClusterComplianceRequest](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/compute#EnforceClusterComplianceRequest), [compute.EnforceClusterComplianceResponse](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/compute#EnforceClusterComplianceResponse), [compute.GetClusterComplianceRequest](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/compute#GetClusterComplianceRequest), [compute.GetClusterComplianceResponse](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/compute#GetClusterComplianceResponse), [compute.ListClusterCompliancesRequest](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/compute#ListClusterCompliancesRequest) and [compute.ListClusterCompliancesResponse](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/compute#ListClusterCompliancesResponse).
* Added [jobs.EnforcePolicyComplianceForJobResponseJobClusterSettingsChange](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#EnforcePolicyComplianceForJobResponseJobClusterSettingsChange), [jobs.EnforcePolicyComplianceRequest](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#EnforcePolicyComplianceRequest), [jobs.EnforcePolicyComplianceResponse](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#EnforcePolicyComplianceResponse), [jobs.GetPolicyComplianceRequest](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#GetPolicyComplianceRequest), [jobs.GetPolicyComplianceResponse](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#GetPolicyComplianceResponse), [jobs.JobCompliance](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#JobCompliance), [jobs.ListJobComplianceForPolicyResponse](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#ListJobComplianceForPolicyResponse) and [jobs.ListJobComplianceRequest](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#ListJobComplianceRequest).
* Added `Fallback` field for [catalog.CreateExternalLocation](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#CreateExternalLocation).
* Added `Fallback` field for [catalog.ExternalLocationInfo](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#ExternalLocationInfo).
* Added `Fallback` field for [catalog.UpdateExternalLocation](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#UpdateExternalLocation).
* Added `JobRunId` field for [jobs.BaseRun](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#BaseRun).
* Added `JobRunId` field for [jobs.Run](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#Run).
* Added `IncludeMetrics` field for [sql.ListQueryHistoryRequest](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/sql#ListQueryHistoryRequest).
* Added `StatementIds` field for [sql.QueryFilter](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/sql#QueryFilter).
* Removed [sql.ContextFilter](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/sql#ContextFilter).
* Removed `ContextFilter` field for [sql.QueryFilter](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/sql#QueryFilter).
* Removed `PipelineId` and `PipelineUpdateId` fields for [sql.QuerySource](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/sql#QuerySource).

OpenAPI SHA: 3eae49b444cac5a0118a3503e5b7ecef7f96527a, Date: 2024-08-21
**Commits**
* `6d867882d0` [Release] Release v0.45.0 ([#1023](https://redirect.github.com/databricks/databricks-sdk-go/issues/1023))
* `ba4489b946` [Fix] Do not specify `--tenant` flag when fetching managed identity access to...
* `f6248097d1` [Internal] Fix billing test for budget configuration update ([#1019](https://redirect.github.com/databricks/databricks-sdk-go/issues/1019))
* `27a5055609` [Internal] Add terraform aliases to Entity ([#1017](https://redirect.github.com/databricks/databricks-sdk-go/issues/1017))
* `382a38d380` [Internal] Added Service.NamedIdMap ([#1016](https://redirect.github.com/databricks/databricks-sdk-go/issues/1016))
* `1ef9931dc9` [Fix] Add INVALID_STATE to error code mapping ([#1014](https://redirect.github.com/databricks/databricks-sdk-go/issues/1014))
* See full diff in [compare view](https://github.com/databricks/databricks-sdk-go/compare/v0.44.0...v0.45.0)
**Most Recent Ignore Conditions Applied to This Pull Request**

| Dependency Name | Ignore Conditions |
| --- | --- |
| github.com/databricks/databricks-sdk-go | [>= 0.28.a, < 0.29] |
[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=github.com/databricks/databricks-sdk-go&package-manager=go_modules&previous-version=0.44.0&new-version=0.45.0)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)
Dependabot will resolve any conflicts with this PR as long as you don't
alter it yourself. You can also trigger a rebase manually by commenting
`@dependabot rebase`.
---------
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Andrew Nester <andrew.nester@databricks.com>
2024-08-27 08:54:05 +00:00
Shreyas Goenka
11dfdc036d
more iteration
2024-08-26 20:16:45 +02:00
Andrew Nester
783e05c939
Do not treat empty path as a local path ( #1717 )
...
## Changes
Fixes an issue introduced in https://github.com/databricks/cli/pull/1699
where PyPI packages were treated as local libraries.
The reason is that `libraryPath` returns an empty string as the path for
PyPI packages, and `IsLibraryLocal` then treated the empty string as a local
path.
Both of these functions are fixed in this PR.
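A minimal Go sketch of the intended behavior; the function name mirrors `IsLibraryLocal`, but the prefix list and rules below are assumptions, not the CLI's actual code:
```go
package main

import (
	"fmt"
	"strings"
)

// isLibraryLocal reports whether a library path refers to a local file.
// An empty path, which is what PyPI / environment dependencies produce,
// must not be treated as local. Sketch only; prefixes are illustrative.
func isLibraryLocal(path string) bool {
	if path == "" {
		return false
	}
	// Remote schemes and workspace paths are not local either.
	for _, prefix := range []string{"dbfs:/", "s3://", "abfss://", "gs://", "/Workspace/", "/Volumes/"} {
		if strings.HasPrefix(path, prefix) {
			return false
		}
	}
	return true
}

func main() {
	fmt.Println(isLibraryLocal(""))                 // false: PyPI package, no path
	fmt.Println(isLibraryLocal("./dist/lib.whl"))   // true: local wheel
	fmt.Println(isLibraryLocal("dbfs:/libs/x.whl")) // false: remote
}
```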
## Tests
Added regression test
2024-08-26 10:03:56 +00:00
Lennart Kats (databricks)
84b47745e4
Ignore CLI version check on development builds of the CLI ( #1714 )
...
## Changes
This change makes sure we ignore the CLI version check on development
builds of the CLI.
Before:
```
$ cat databricks.yml | grep cli_version
databricks_cli_version: ">= 0.223.1"
$ cli bundle deploy
Error: Databricks CLI version constraint not satisfied. Required: >= 0.223.1, current: 0.0.0-dev+06b169284737
```
After:
```
...
$ cli bundle deploy
...
Warning: Ignoring Databricks CLI version constraint for development build. Required: >= 0.223.1, current: 0.0.0-dev+d52d6f08fcd5
```
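As a sketch, the dev-build check could look like the following, assuming the `0.0.0-dev+<sha>` version convention shown above; the function names are illustrative, not the CLI's implementation:
```go
package main

import (
	"fmt"
	"strings"
)

// isDevBuild reports whether the CLI binary is a development build.
// Development builds carry the 0.0.0-dev+<metadata> version shown above.
func isDevBuild(version string) bool {
	return strings.HasPrefix(version, "0.0.0-dev")
}

// checkVersionConstraint enforces databricks_cli_version, except on dev builds,
// where it only warns. Constraint evaluation itself (">= 0.223.1") would be
// done with a semver library and is elided here.
func checkVersionConstraint(constraint, current string) error {
	if isDevBuild(current) {
		fmt.Printf("Warning: Ignoring Databricks CLI version constraint for development build. Required: %s, current: %s\n", constraint, current)
		return nil
	}
	// ... compare current against constraint and return an error if unsatisfied.
	return nil
}

func main() {
	_ = checkVersionConstraint(">= 0.223.1", "0.0.0-dev+d52d6f08fcd5")
}
```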
## Tests
2024-08-23 10:13:21 +00:00
Shreyas Goenka
535f670868
remove most of the debug helpers
2024-08-22 17:43:22 +02:00
Shreyas Goenka
790731f981
remove generic schema test
2024-08-22 17:42:19 +02:00
Shreyas Goenka
39efb705de
-
2024-08-22 17:40:43 +02:00
Shreyas Goenka
7c70179091
add self referential test
2024-08-22 17:38:24 +02:00
Shreyas Goenka
08133126ae
Merge remote-tracking branch 'origin' into improve/json-schema
2024-08-22 17:27:49 +02:00
shreyas-goenka
7fe08c2386
Revert hc-install version to 0.7.0 ( #1711 )
...
## Changes
With hc-install version `0.8.0` there was a regression where debug logs
would be leaked into stderr. Reported upstream in
https://github.com/hashicorp/hc-install/issues/239 .
Meanwhile, we need to revert and pin to version `0.7.0`. This PR also
includes a regression test.
## Tests
Regression test.
2024-08-22 15:04:26 +00:00
Andrew Nester
35e48be81c
[Release] Release v0.227.0 ( #1705 )
...
CLI:
* Added filtering flags for cluster list commands
([#1703 ](https://github.com/databricks/cli/pull/1703 )).
Bundles:
* Remove reference to "dbt" in the default-sql template
([#1696 ](https://github.com/databricks/cli/pull/1696 )).
* Pause continuous pipelines when 'mode: development' is used
([#1590 ](https://github.com/databricks/cli/pull/1590 )).
* Add configurable presets for name prefixes, tags, etc.
([#1490 ](https://github.com/databricks/cli/pull/1490 )).
* Report all empty resources present in error diagnostic
([#1685 ](https://github.com/databricks/cli/pull/1685 )).
* Improves detection of PyPI package names in environment dependencies
([#1699 ](https://github.com/databricks/cli/pull/1699 )).
* [DAB] Add support for requirements libraries in Job Tasks
([#1543 ](https://github.com/databricks/cli/pull/1543 )).
* Add paths field to bundle sync configuration
([#1694 ](https://github.com/databricks/cli/pull/1694 )).
Internal:
* Add `import` option for PyDABs
([#1693 ](https://github.com/databricks/cli/pull/1693 )).
* Make fileset take optional list of paths to list
([#1684 ](https://github.com/databricks/cli/pull/1684 )).
* Pass through paths argument to libs/sync
([#1689 ](https://github.com/databricks/cli/pull/1689 )).
* Correctly mark package names with versions as remote libraries
([#1697 ](https://github.com/databricks/cli/pull/1697 )).
* Share test initializer in common helper function
([#1695 ](https://github.com/databricks/cli/pull/1695 )).
* Make `pydabs/venv_path` optional
([#1687 ](https://github.com/databricks/cli/pull/1687 )).
* Use API mocks for duplicate path errors in workspace files extensions
client ([#1690 ](https://github.com/databricks/cli/pull/1690 )).
* Fix prefix preset used for UC schemas
([#1704 ](https://github.com/databricks/cli/pull/1704 )).
2024-08-22 08:44:22 +00:00
Pieter Noordhuis
6e8cd835a3
Add paths field to bundle sync configuration ( #1694 )
...
## Changes
This field allows a user to configure paths to synchronize to the
workspace.
Allowed values are relative paths to files and directories anchored at
the directory where the field is set. If one or more values traverse up
the directory tree (to an ancestor of the bundle root directory), the
CLI will dynamically determine the root path to use to ensure that the
file tree structure remains intact.
For example, given a `databricks.yml` in `my_bundle` that includes:
```yaml
sync:
  paths:
    - ../common
    - .
```
Then upon synchronization, the workspace will look like:
```
.
├── common
│   └── lib.py
└── my_bundle
    ├── databricks.yml
    └── notebook.py
```
If not set, the behavior remains identical.
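A rough Go sketch of how the effective sync root could be determined when configured paths traverse above the bundle root; names and logic are assumptions, not the CLI's implementation:
```go
package main

import (
	"fmt"
	"path/filepath"
	"strings"
)

// contains reports whether path lies inside root (or equals it).
func contains(root, path string) bool {
	rel, err := filepath.Rel(root, path)
	return err == nil && rel != ".." && !strings.HasPrefix(rel, ".."+string(filepath.Separator))
}

// syncRoot walks the root upward until every configured sync path fits under it,
// so that the file tree structure is preserved on upload.
func syncRoot(bundleRoot string, paths []string) string {
	root := filepath.Clean(bundleRoot)
	for _, p := range paths {
		target := filepath.Clean(filepath.Join(bundleRoot, p))
		for !contains(root, target) && root != filepath.Dir(root) {
			root = filepath.Dir(root) // move up one directory and retry
		}
	}
	return root
}

func main() {
	// With "../common" in sync.paths, the root moves from my_bundle to its parent.
	fmt.Println(syncRoot("/repo/my_bundle", []string{"../common", "."})) // /repo
}
```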
## Tests
* Newly added unit tests for the mutators and under `bundle/tests`.
* Manually confirmed a bundle without this configuration works the same.
* Manually confirmed a bundle with this configuration works.
2024-08-21 15:33:25 +00:00
Andrew Nester
6f345293b1
Added filtering flags for cluster list commands ( #1703 )
...
## Changes
Fixes #1701
## Tests
```
Usage:
  databricks clusters list [flags]

Flags:
      --cluster-sources []string   Filter clusters by source
      --cluster-states []string    Filter clusters by states
  -h, --help                       help for list
      --is-pinned                  Filter clusters by pinned status
      --page-size int              Use this field to specify the maximum number of results to be returned by the server.
      --page-token string          Use next_page_token or prev_page_token returned from the previous request to list the next or previous page of clusters respectively.
      --policy-id string           Filter clusters by policy id
```
2024-08-21 15:05:49 +00:00
shreyas-goenka
f5df211320
Fix prefix preset used for UC schemas ( #1704 )
...
## Changes
In https://github.com/databricks/cli/pull/1490 we regressed and started
using the development mode prefix for UC schemas regardless of the mode
of the bundle target.
This PR fixes the regression and adds a regression test
## Tests
Failing integration tests pass now.
2024-08-21 12:53:54 +00:00
Witold Czaplewski
192f33bb13
[DAB] Add support for requirements libraries in Job Tasks ( #1543 )
...
## Changes
While experimenting with DAB I discovered that requirements libraries
are being ignored.
One thing worth mentioning is that `bundle validate` runs successfully,
but `bundle deploy` fails. This PR only covers the second part.
## Tests
Added a unit test
2024-08-21 10:03:56 +00:00
Andrew Nester
c775d251ed
Improves detection of PyPI package names in environment dependencies ( #1699 )
...
## Changes
Improves detection of PyPI package names in environment dependencies
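One way to tell PyPI package specifiers apart from local paths is a name-plus-version-specifier pattern; the regular expression below is an approximation, not the CLI's actual detection logic:
```go
package main

import (
	"fmt"
	"regexp"
)

// pypiPackage approximates a PEP 508 distribution name with an optional version
// specifier, e.g. "requests", "my-package==1.2.3", "numpy>=1.24,<2".
var pypiPackage = regexp.MustCompile(
	`^[a-zA-Z0-9]([a-zA-Z0-9_.-]*[a-zA-Z0-9])?\s*((==|>=|<=|~=|!=|>|<)\s*[^,\s]+(\s*,\s*(==|>=|<=|~=|!=|>|<)\s*[^,\s]+)*)?$`)

// isPyPIPackage returns true for package specifiers and false for path-like entries.
func isPyPIPackage(dep string) bool {
	return pypiPackage.MatchString(dep)
}

func main() {
	for _, dep := range []string{"requests", "my-package==1.2.3", "numpy>=1.24,<2", "./dist/pkg.whl", "../local_lib"} {
		fmt.Printf("%-20s %v\n", dep, isPyPIPackage(dep))
	}
}
```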
## Tests
Added unit tests
2024-08-21 08:22:35 +00:00
shreyas-goenka
a4c1ba3e28
Use API mocks for duplicate path errors in workspace files extensions client ( #1690 )
...
## Changes
`TestAccFilerWorkspaceFilesExtensionsErrorsOnDupName` recently started
failing in our nightlies because the upstream `import` API was changed
to [prohibit conflicting file
paths](https://docs.databricks.com/en/release-notes/product/2024/august.html#files-can-no-longer-have-identical-names-in-workspace-folders ).
Because existing conflicting file paths can still be grandfathered in,
we need to retain coverage for the test. To do this, this PR:
1. Removes the failing
`TestAccFilerWorkspaceFilesExtensionsErrorsOnDupName`
2. Adds an equivalent unit test with the `list` and `get-status` API
calls mocked.
2024-08-21 07:45:25 +00:00
Shreyas Goenka
00e5896966
add test for recursive
2024-08-20 21:53:04 +02:00
Shreyas Goenka
460eeb928d
more tests
2024-08-20 21:11:42 +02:00
Shreyas Goenka
e24725d230
added nested tests
2024-08-20 21:06:42 +02:00
Shreyas Goenka
b023ba0dd4
removed more tests
2024-08-20 19:57:57 +02:00
Shreyas Goenka
b89928513e
-
2024-08-20 19:10:50 +02:00
Shreyas Goenka
7db32fc211
-
2024-08-20 19:09:21 +02:00
Shreyas Goenka
65f1c750f6
-
2024-08-20 18:46:23 +02:00
Shreyas Goenka
6d2f88282f
more progress on the internal functionality
2024-08-20 18:40:30 +02:00
Shreyas Goenka
43325fdd0a
add some more todos
2024-08-20 15:39:31 +02:00
Shreyas Goenka
75f252e51c
Improve JSON schema
2024-08-20 15:34:49 +02:00
Gleb Kanterov
44902fa350
Make `pydabs/venv_path` optional ( #1687 )
...
## Changes
Make `pydabs/venv_path` optional. When not specified, CLI detects the
Python interpreter using `python.DetectExecutable`, the same way as for
`artifacts`. `python.DetectExecutable` works correctly if a virtual
environment is activated or `python3` is available on PATH through other
means.
Extract the venv detection code from PyDABs into `libs/python/detect`.
This code will be used when we implement the `python/venv_path` section
in `databricks.yml`.
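A simplified sketch of this kind of interpreter auto-detection; it is not the code in `libs/python/detect`, which is more thorough:
```go
package main

import (
	"errors"
	"fmt"
	"os"
	"os/exec"
	"path/filepath"
	"runtime"
)

// detectPython returns the Python interpreter to use: the interpreter of an
// activated virtual environment if VIRTUAL_ENV is set, otherwise python3 from PATH.
func detectPython() (string, error) {
	if venv := os.Getenv("VIRTUAL_ENV"); venv != "" {
		bin := "bin"
		if runtime.GOOS == "windows" {
			bin = "Scripts"
		}
		candidate := filepath.Join(venv, bin, "python3")
		if _, err := os.Stat(candidate); err == nil {
			return candidate, nil
		}
	}
	if path, err := exec.LookPath("python3"); err == nil {
		return path, nil
	}
	return "", errors.New("no python3 interpreter found")
}

func main() {
	path, err := detectPython()
	fmt.Println(path, err)
}
```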
## Tests
Unit tests and manually
---------
Co-authored-by: Pieter Noordhuis <pcnoordhuis@gmail.com>
2024-08-20 13:26:57 +00:00
Pieter Noordhuis
af5048e73e
Share test initializer in common helper function ( #1695 )
...
## Changes
These tests inadvertently re-ran mutators, the first time through
`loadTarget` and the second time by running `phases.Initialize()`
themselves. Some of the mutators that are executed in
`phases.Initialize()` are also run as part of `loadTarget`. This is
overdue for a refactor to make it unambiguous what runs when. Until then,
this removes the duplicated execution.
## Tests
Unit tests pass.
2024-08-20 12:54:56 +00:00
Andrew Nester
6771ba09a6
Correctly mark package names with versions as remote libraries ( #1697 )
...
## Changes
Fixes https://github.com/databricks/setup-cli/issues/124
## Tests
Added regression test
2024-08-20 09:33:03 +00:00
shreyas-goenka
242d4b51ed
Report all empty resources present in error diagnostic ( #1685 )
...
## Changes
This PR addressed post-merge feedback from
https://github.com/databricks/cli/pull/1673 .
## Tests
Unit tests, and manually.
```
Error: experiment undefined-experiment is not defined
  at resources.experiments.undefined-experiment
  in databricks.yml:11:26
Error: job undefined-job is not defined
  at resources.jobs.undefined-job
  in databricks.yml:6:19
Error: pipeline undefined-pipeline is not defined
  at resources.pipelines.undefined-pipeline
  in databricks.yml:14:24
Name: undefined-job
Target: default
Found 3 errors
```
2024-08-20 00:22:00 +00:00
Lennart Kats (databricks)
78d0ac5c6a
Add configurable presets for name prefixes, tags, etc. ( #1490 )
...
## Changes
This adds configurable transformations based on the transformations
currently seen in `mode: development`.
Example databricks.yml showcasing some of these transformations:
```
bundle:
  name: my_bundle

targets:
  dev:
    presets:
      prefix: "myprefix_"           # prefix all resource names with myprefix_
      pipelines_development: true   # set development to true by default for pipelines
      trigger_pause_status: PAUSED  # set pause_status to PAUSED by default for all triggers and schedules
      jobs_max_concurrent_runs: 10  # set max_concurrent_runs to 10 by default for all jobs
      tags:
        dev: true
```
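As a toy sketch, the name-prefix preset amounts to the following; types and field names are assumptions, not the CLI's mutator interface:
```go
package main

import "fmt"

// Presets mirrors the per-target configuration block shown above.
// Field names here are illustrative; consult the bundle schema for the real ones.
type Presets struct {
	Prefix                string
	TriggerPauseStatus    string
	JobsMaxConcurrentRuns int
	Tags                  map[string]string
}

// applyNamePrefix prepends the configured prefix to every resource name,
// the same kind of transformation 'mode: development' applies implicitly.
func applyNamePrefix(p Presets, names map[string]string) map[string]string {
	out := make(map[string]string, len(names))
	for key, name := range names {
		out[key] = p.Prefix + name
	}
	return out
}

func main() {
	p := Presets{Prefix: "myprefix_", Tags: map[string]string{"dev": "true"}}
	fmt.Println(applyNamePrefix(p, map[string]string{"jobs.my_job": "my_job"}))
}
```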
## Tests
* Existing process_target_mode tests that were adapted to use this new
code
* Unit tests specific for the new mutator
* Unit tests for config loading and merging
* Manual e2e testing
2024-08-19 18:18:50 +00:00
Lennart Kats (databricks)
07627023f5
Pause continuous pipelines when 'mode: development' is used ( #1590 )
...
## Changes
This makes it so that the pipelines `continuous` property is set to
false by default when using `mode: development`.
2024-08-19 16:27:57 +00:00
Lennart Kats (databricks)
8238a6ad0a
Remove reference to "dbt" in the default-sql template ( #1696 )
...
## Changes
The `default-sql` template inadvertently mentioned dbt in one of the
prompts. This PR removes that reference.
2024-08-19 15:47:18 +00:00