mirror of https://github.com/databricks/cli.git
build(deps): bump github.com/databricks/databricks-sdk-go from 0.57.0 to 0.58.1 (#2357)
Bumps [github.com/databricks/databricks-sdk-go](https://github.com/databricks/databricks-sdk-go) from 0.57.0 to 0.58.1.

**Release notes for v0.58.1**

Internal Changes:
- Do not send ForceSendFields as query parameters.

**Release notes for v0.58.0**

New Features and Improvements:
- Enable async refreshes for OAuth tokens ([#1143](https://redirect.github.com/databricks/databricks-sdk-go/pull/1143)).

Internal Changes:
- Add support for asynchronous data plane token refreshes ([#1142](https://redirect.github.com/databricks/databricks-sdk-go/pull/1142)).
- Introduce new TokenSource interface that takes a `context.Context` ([#1141](https://redirect.github.com/databricks/databricks-sdk-go/pull/1141)).

API Changes:
- Added `GetMessageQueryResultByAttachment` method for the `w.Genie` workspace-level service.
- Added `Id` field for `apps.App`.
- Added `LimitConfig` field for `billing.UpdateBudgetPolicyRequest`.
- Added `Volumes` field for `compute.ClusterLogConf`.
- Removed `ReviewState`, `Reviews` and `RunnerCollaborators` fields for `cleanrooms.CleanRoomAssetNotebook`.

OpenAPI SHA: 99f644e72261ef5ecf8d74db20f4b7a1e09723cc, Date: 2025-02-11

**Commits**
- `967d063` [Release] Release v0.58.1 ([#1146](https://redirect.github.com/databricks/databricks-sdk-go/issues/1146))
- `9dc3c56` [Release] Release v0.58.0 ([#1144](https://redirect.github.com/databricks/databricks-sdk-go/issues/1144))
- `8307a4d` [Feature] Enable async refreshes for OAuth tokens ([#1143](https://redirect.github.com/databricks/databricks-sdk-go/issues/1143))
- `815cace` [Internal] Add support for asynchronous data plane token refreshes ([#1142](https://redirect.github.com/databricks/databricks-sdk-go/issues/1142))
- `3aebd68` [Internal] Introduce new TokenSource interface that takes a `context.Context` ([#1141](https://redirect.github.com/databricks/databricks-sdk-go/issues/1141))
- See the full diff in the [compare view](https://github.com/databricks/databricks-sdk-go/compare/v0.57.0...v0.58.1).

Most recent ignore conditions applied to this PR: github.com/databricks/databricks-sdk-go `[>= 0.28.a, < 0.29]`.

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`.

**Dependabot commands and options** (trigger by commenting on this PR):
- `@dependabot rebase` rebases this PR
- `@dependabot recreate` recreates this PR, overwriting any edits that have been made to it
- `@dependabot merge` merges this PR after your CI passes on it
- `@dependabot squash and merge` squashes and merges this PR after your CI passes on it
- `@dependabot cancel merge` cancels a previously requested merge and blocks automerging
- `@dependabot reopen` reopens this PR if it is closed
- `@dependabot close` closes this PR and stops Dependabot recreating it (closing it manually has the same effect)
- `@dependabot show <dependency name> ignore conditions` shows all of the ignore conditions of the specified dependency
- `@dependabot ignore this major version` closes this PR and stops Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
- `@dependabot ignore this minor version` does the same for this minor version
- `@dependabot ignore this dependency` does the same for this dependency

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Andrew Nester <andrew.nester@databricks.com>
parent 25a701be92, commit 764142978c
```diff
@@ -1 +1 @@
-c72c58f97b950fcb924a90ef164bcb10cfcd5ece
+99f644e72261ef5ecf8d74db20f4b7a1e09723cc
```
```diff
@@ -19,6 +19,9 @@ github.com/databricks/cli/bundle/config/resources.App:
   "description":
     "description": |-
       The description of the app.
+  "id":
+    "description": |-
+      The unique identifier of the app.
   "name":
     "description": |-
       The name of the app. The name must contain only lowercase alphanumeric characters and hyphens.
```
```diff
@@ -67,7 +70,7 @@ github.com/databricks/cli/bundle/config/resources.Cluster:
   "cluster_log_conf":
     "description": |-
       The configuration for delivering spark logs to a long-term storage destination.
-      Two kinds of destinations (dbfs and s3) are supported. Only one destination can be specified
+      Three kinds of destinations (DBFS, S3 and Unity Catalog volumes) are supported. Only one destination can be specified
       for one cluster. If the conf is given, the logs will be delivered to the destination every
       `5 mins`. The destination of driver logs is `$destination/$clusterId/driver`, while
       the destination of executor logs is `$destination/$clusterId/executor`.
```
```diff
@@ -1009,6 +1012,10 @@ github.com/databricks/databricks-sdk-go/service/compute.ClusterLogConf:
       `{ "s3": { "destination" : "s3://cluster_log_bucket/prefix", "region" : "us-west-2" } }`
       Cluster iam role is used to access s3, please make sure the cluster iam role in
       `instance_profile_arn` has permission to write data to the s3 destination.
+  "volumes":
+    "description": |-
+      destination needs to be provided. e.g.
+      `{ "volumes" : { "destination" : "/Volumes/catalog/schema/volume/cluster_log" } }`
 github.com/databricks/databricks-sdk-go/service/compute.ClusterSpec:
   "apply_policy_default_values":
     "description": |-
```
```diff
@@ -1034,7 +1041,7 @@ github.com/databricks/databricks-sdk-go/service/compute.ClusterSpec:
   "cluster_log_conf":
     "description": |-
       The configuration for delivering spark logs to a long-term storage destination.
-      Two kinds of destinations (dbfs and s3) are supported. Only one destination can be specified
+      Three kinds of destinations (DBFS, S3 and Unity Catalog volumes) are supported. Only one destination can be specified
       for one cluster. If the conf is given, the logs will be delivered to the destination every
       `5 mins`. The destination of driver logs is `$destination/$clusterId/driver`, while
       the destination of executor logs is `$destination/$clusterId/executor`.
```
```diff
@@ -1428,7 +1435,7 @@ github.com/databricks/databricks-sdk-go/service/compute.S3StorageInfo:
 github.com/databricks/databricks-sdk-go/service/compute.VolumesStorageInfo:
   "destination":
     "description": |-
-      Unity Catalog Volumes file destination, e.g. `/Volumes/my-init.sh`
+      Unity Catalog volumes file destination, e.g. `/Volumes/catalog/schema/volume/dir/file`
 github.com/databricks/databricks-sdk-go/service/compute.WorkloadType:
   "clients":
     "description": |2-
```
```diff
@@ -2985,7 +2992,7 @@ github.com/databricks/databricks-sdk-go/service/serving.ExternalModel:
       PaLM Config. Only required if the provider is 'palm'.
   "provider":
     "description": |-
-      The name of the provider for the external model. Currently, the supported providers are 'ai21labs', 'anthropic', 'amazon-bedrock', 'cohere', 'databricks-model-serving', 'google-cloud-vertex-ai', 'openai', and 'palm'.
+      The name of the provider for the external model. Currently, the supported providers are 'ai21labs', 'anthropic', 'amazon-bedrock', 'cohere', 'databricks-model-serving', 'google-cloud-vertex-ai', 'openai', 'palm', and 'custom'.
   "task":
     "description": |-
       The task type of the external model.
```
```diff
@@ -88,6 +88,10 @@
         "description": {
           "$ref": "#/$defs/string"
         },
+        "id": {
+          "description": "The unique identifier of the app.",
+          "$ref": "#/$defs/string"
+        },
         "name": {
           "$ref": "#/$defs/string"
         },
```
```diff
@@ -160,7 +164,7 @@
         "$ref": "#/$defs/github.com/databricks/databricks-sdk-go/service/compute.AzureAttributes"
       },
       "cluster_log_conf": {
-        "description": "The configuration for delivering spark logs to a long-term storage destination.\nTwo kinds of destinations (dbfs and s3) are supported. Only one destination can be specified\nfor one cluster. If the conf is given, the logs will be delivered to the destination every\n`5 mins`. The destination of driver logs is `$destination/$clusterId/driver`, while\nthe destination of executor logs is `$destination/$clusterId/executor`.",
+        "description": "The configuration for delivering spark logs to a long-term storage destination.\nThree kinds of destinations (DBFS, S3 and Unity Catalog volumes) are supported. Only one destination can be specified\nfor one cluster. If the conf is given, the logs will be delivered to the destination every\n`5 mins`. The destination of driver logs is `$destination/$clusterId/driver`, while\nthe destination of executor logs is `$destination/$clusterId/executor`.",
         "$ref": "#/$defs/github.com/databricks/databricks-sdk-go/service/compute.ClusterLogConf"
       },
       "cluster_name": {
```
```diff
@@ -2495,6 +2499,10 @@
       "s3": {
         "description": "destination and either the region or endpoint need to be provided. e.g.\n`{ \"s3\": { \"destination\" : \"s3://cluster_log_bucket/prefix\", \"region\" : \"us-west-2\" } }`\nCluster iam role is used to access s3, please make sure the cluster iam role in\n`instance_profile_arn` has permission to write data to the s3 destination.",
         "$ref": "#/$defs/github.com/databricks/databricks-sdk-go/service/compute.S3StorageInfo"
       },
+      "volumes": {
+        "description": "destination needs to be provided. e.g.\n`{ \"volumes\" : { \"destination\" : \"/Volumes/catalog/schema/volume/cluster_log\" } }`",
+        "$ref": "#/$defs/github.com/databricks/databricks-sdk-go/service/compute.VolumesStorageInfo"
+      }
     },
     "additionalProperties": false
```
```diff
@@ -2531,7 +2539,7 @@
         "$ref": "#/$defs/github.com/databricks/databricks-sdk-go/service/compute.AzureAttributes"
       },
       "cluster_log_conf": {
-        "description": "The configuration for delivering spark logs to a long-term storage destination.\nTwo kinds of destinations (dbfs and s3) are supported. Only one destination can be specified\nfor one cluster. If the conf is given, the logs will be delivered to the destination every\n`5 mins`. The destination of driver logs is `$destination/$clusterId/driver`, while\nthe destination of executor logs is `$destination/$clusterId/executor`.",
+        "description": "The configuration for delivering spark logs to a long-term storage destination.\nThree kinds of destinations (DBFS, S3 and Unity Catalog volumes) are supported. Only one destination can be specified\nfor one cluster. If the conf is given, the logs will be delivered to the destination every\n`5 mins`. The destination of driver logs is `$destination/$clusterId/driver`, while\nthe destination of executor logs is `$destination/$clusterId/executor`.",
         "$ref": "#/$defs/github.com/databricks/databricks-sdk-go/service/compute.ClusterLogConf"
       },
       "cluster_name": {
```
```diff
@@ -3116,7 +3124,7 @@
       "type": "object",
       "properties": {
         "destination": {
-          "description": "Unity Catalog Volumes file destination, e.g. `/Volumes/my-init.sh`",
+          "description": "Unity Catalog volumes file destination, e.g. `/Volumes/catalog/schema/volume/dir/file`",
           "$ref": "#/$defs/string"
         }
       },
```
```diff
@@ -6077,7 +6085,7 @@
         "$ref": "#/$defs/github.com/databricks/databricks-sdk-go/service/serving.PaLmConfig"
       },
       "provider": {
-        "description": "The name of the provider for the external model. Currently, the supported providers are 'ai21labs', 'anthropic', 'amazon-bedrock', 'cohere', 'databricks-model-serving', 'google-cloud-vertex-ai', 'openai', and 'palm'.",
+        "description": "The name of the provider for the external model. Currently, the supported providers are 'ai21labs', 'anthropic', 'amazon-bedrock', 'cohere', 'databricks-model-serving', 'google-cloud-vertex-ai', 'openai', 'palm', and 'custom'.",
         "$ref": "#/$defs/github.com/databricks/databricks-sdk-go/service/serving.ExternalModelProvider"
       },
       "task": {
```
```diff
@@ -305,6 +305,8 @@ func newUpdate() *cobra.Command {
 	// TODO: short flags
 	cmd.Flags().Var(&updateJson, "json", `either inline JSON string or @path/to/file.json with request body`)
 
+	// TODO: complex arg: limit_config
+
 	// TODO: array: custom_tags
 	cmd.Flags().StringVar(&updateReq.Policy.PolicyName, "policy-name", updateReq.Policy.PolicyName, `The name of the policy.`)
 
```
```diff
@@ -40,6 +40,7 @@ func New() *cobra.Command {
 	cmd.AddCommand(newExecuteMessageQuery())
 	cmd.AddCommand(newGetMessage())
 	cmd.AddCommand(newGetMessageQueryResult())
+	cmd.AddCommand(newGetMessageQueryResultByAttachment())
 	cmd.AddCommand(newStartConversation())
 
 	// Apply optional overrides to this command.
```
```diff
@@ -344,6 +345,71 @@ func newGetMessageQueryResult() *cobra.Command {
 	return cmd
 }
 
+// start get-message-query-result-by-attachment command
+
+// Slice with functions to override default command behavior.
+// Functions can be added from the `init()` function in manually curated files in this directory.
+var getMessageQueryResultByAttachmentOverrides []func(
+	*cobra.Command,
+	*dashboards.GenieGetQueryResultByAttachmentRequest,
+)
+
+func newGetMessageQueryResultByAttachment() *cobra.Command {
+	cmd := &cobra.Command{}
+
+	var getMessageQueryResultByAttachmentReq dashboards.GenieGetQueryResultByAttachmentRequest
+
+	// TODO: short flags
+
+	cmd.Use = "get-message-query-result-by-attachment SPACE_ID CONVERSATION_ID MESSAGE_ID ATTACHMENT_ID"
+	cmd.Short = `Get conversation message SQL query result by attachment id.`
+	cmd.Long = `Get conversation message SQL query result by attachment id.
+
+  Get the result of SQL query by attachment id This is only available if a
+  message has a query attachment and the message status is EXECUTING_QUERY.
+
+  Arguments:
+    SPACE_ID: Genie space ID
+    CONVERSATION_ID: Conversation ID
+    MESSAGE_ID: Message ID
+    ATTACHMENT_ID: Attachment ID`
+
+	cmd.Annotations = make(map[string]string)
+
+	cmd.Args = func(cmd *cobra.Command, args []string) error {
+		check := root.ExactArgs(4)
+		return check(cmd, args)
+	}
+
+	cmd.PreRunE = root.MustWorkspaceClient
+	cmd.RunE = func(cmd *cobra.Command, args []string) (err error) {
+		ctx := cmd.Context()
+		w := root.WorkspaceClient(ctx)
+
+		getMessageQueryResultByAttachmentReq.SpaceId = args[0]
+		getMessageQueryResultByAttachmentReq.ConversationId = args[1]
+		getMessageQueryResultByAttachmentReq.MessageId = args[2]
+		getMessageQueryResultByAttachmentReq.AttachmentId = args[3]
+
+		response, err := w.Genie.GetMessageQueryResultByAttachment(ctx, getMessageQueryResultByAttachmentReq)
+		if err != nil {
+			return err
+		}
+		return cmdio.Render(ctx, response)
+	}
+
+	// Disable completions since they are not applicable.
+	// Can be overridden by manual implementation in `override.go`.
+	cmd.ValidArgsFunction = cobra.NoFileCompletions
+
+	// Apply optional overrides to this command.
+	for _, fn := range getMessageQueryResultByAttachmentOverrides {
+		fn(cmd, &getMessageQueryResultByAttachmentReq)
+	}
+
+	return cmd
+}
+
 // start start-conversation command
 
 // Slice with functions to override default command behavior.
```
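The generated command above validates its positional arguments with `root.ExactArgs(4)` before running. A cobra-free sketch of that check — the `exactArgs` helper below is an illustrative reimplementation, not the CLI's actual code:

```go
package main

import "fmt"

// exactArgs returns a validator that accepts exactly n positional arguments,
// mimicking the root.ExactArgs helper used by the generated command
// (illustrative reimplementation, not the CLI's code).
func exactArgs(n int) func(args []string) error {
	return func(args []string) error {
		if len(args) != n {
			return fmt.Errorf("accepts %d arg(s), received %d", n, len(args))
		}
		return nil
	}
}

func main() {
	check := exactArgs(4)
	// Matches the SPACE_ID CONVERSATION_ID MESSAGE_ID ATTACHMENT_ID signature.
	fmt.Println(check([]string{"space", "conversation", "message", "attachment"})) // <nil>
	fmt.Println(check([]string{"space"}))
}
```

Returning a closure keeps the arity check composable, which is why the generated code assigns it to `cmd.Args` rather than checking `len(args)` inline.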
go.mod:
```diff
@@ -9,7 +9,7 @@ require (
 	github.com/BurntSushi/toml v1.4.0 // MIT
 	github.com/Masterminds/semver/v3 v3.3.1 // MIT
 	github.com/briandowns/spinner v1.23.1 // Apache 2.0
-	github.com/databricks/databricks-sdk-go v0.57.0 // Apache 2.0
+	github.com/databricks/databricks-sdk-go v0.58.1 // Apache 2.0
 	github.com/fatih/color v1.18.0 // MIT
 	github.com/google/uuid v1.6.0 // BSD-3-Clause
 	github.com/gorilla/mux v1.8.1 // BSD 3-Clause
```
```diff
@@ -34,8 +34,8 @@ github.com/cncf/udpa/go v0.0.0-20191209042840-269d4d468f6f/go.mod h1:M8M6+tZqaGX
 github.com/cpuguy83/go-md2man/v2 v2.0.4/go.mod h1:tgQtvFlXSQOSOSIRvRPT7W67SCa46tRHOmNcaadrF8o=
 github.com/cyphar/filepath-securejoin v0.2.5 h1:6iR5tXJ/e6tJZzzdMc1km3Sa7RRIVBKAK32O2s7AYfo=
 github.com/cyphar/filepath-securejoin v0.2.5/go.mod h1:aPGpWjXOXUn2NCNjFvBE6aRxGGx79pTxQpKOJNYHHl4=
-github.com/databricks/databricks-sdk-go v0.57.0 h1:Vs3a+Zmg403er4+xpD7ZTQWm7e51d2q3yYEyIIgvtYw=
-github.com/databricks/databricks-sdk-go v0.57.0/go.mod h1:JpLizplEs+up9/Z4Xf2x++o3sM9eTTWFGzIXAptKJzI=
+github.com/databricks/databricks-sdk-go v0.58.1 h1:dUs9ZmFi7hYiL3NwLSAbxqQu66E3BzwM8EU/wcCTJ10=
+github.com/databricks/databricks-sdk-go v0.58.1/go.mod h1:JpLizplEs+up9/Z4Xf2x++o3sM9eTTWFGzIXAptKJzI=
 github.com/davecgh/go-spew v1.1.0/go.mod h1:J7Y8YcW2NihsgmVo/mv3lAwl/skON4iLHjSsI+c5H38=
 github.com/davecgh/go-spew v1.1.1 h1:vj9j/u1bqnvCEfJOwUhtlOARqs3+rkHYY13jYWTU97c=
 github.com/davecgh/go-spew v1.1.1/go.mod h1:J7Y8YcW2NihsgmVo/mv3lAwl/skON4iLHjSsI+c5H38=
```