feat(sinks): add new databricks_zerobus for Databricks ingestion #24840
flaviofcruz wants to merge 8 commits into vectordotdev:master
Conversation
Force-pushed from 2368e4a to 42bf043.
Thanks @flaviofcruz for this new integration! Apologies for the slow review on this one. @codex review
💡 Codex Review
Here are some automated review suggestions for this pull request.
Reviewed commit: 436d0da4bd
ℹ️ About Codex in GitHub
Your team has set up Codex to review pull requests in this repo. Reviews are triggered when you:
- Open a pull request for review
- Mark a draft as ready
- Comment "@codex review"
If Codex has suggestions, it will comment; otherwise it will react with 👍.
Codex can also answer questions or update the PR. Try commenting "@codex address that feedback".
Force-pushed from 436d0da to 9dcb5d1.
All contributors have signed the CLA ✍️ ✅
💡 Codex Review (reviewed commit: 9dcb5d1e71)
FYI I am waiting for @hsuanyi to sign the CLA (see comment) before reviewing this further. Also, there is an unresolved review comment.
@flaviofcruz in case you missed the above, we will require everyone who contributed to this PR to sign the CLA. Happy to review once that is done.
I have read the CLA Document and I hereby sign the CLA |
recheck
💡 Codex Review (reviewed commit: 050409defd)
@pront really appreciate your work on the review. However, I was looking at the zerobus SDK license and it could be problematic: https://github.com/databricks/zerobus-sdk/blob/main/LICENSE Do you know if this could be a blocker? Let me know if it is.
Force-pushed from 050409d to 36a74ef.
💡 Codex Review (reviewed commit: 36a74ef530)
💡 Codex Review (reviewed commit: 9d7b620bfb)
```rust
// Remove the stream from the slot so the next retry creates a fresh one,
// then try to close gracefully. Dropping the slot's Arc first means our
// local `stream` may be the sole owner, allowing `Arc::get_mut` to succeed.
self.stream.lock().await.take();
```
Guard stream reset against stale concurrent failures
When a retryable ingest error occurs, this unconditionally clears the shared stream slot, even if another task has already replaced it with a fresh stream. In concurrent ingest (which this sink uses), an old in-flight request can fail after recovery and then remove the new healthy stream, causing unnecessary reconnect churn and extra retries. Reset logic should only clear the slot if it still points to the same Arc that failed.
This seems like a real issue; now checking whether the existing stream is the same as the one we hold, so that we only take it when that is the case.
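The fix the author describes (only clearing the slot when it still holds the failed stream) can be sketched with `Arc::ptr_eq`. This is a simplified, hypothetical version: the real sink uses an async `tokio::sync::Mutex` and the SDK's stream type, while this sketch uses `std::sync::Mutex` and a stub `Stream` to stay self-contained:

```rust
use std::sync::{Arc, Mutex};

// Hypothetical stand-in for the SDK's ingest stream handle.
struct Stream {
    id: u64,
}

// Clear the shared slot only if it still points at the stream that failed.
// A stale in-flight failure then cannot evict a fresh, healthy stream that
// another task has already installed.
fn reset_if_same(slot: &Mutex<Option<Arc<Stream>>>, failed: &Arc<Stream>) -> bool {
    let mut guard = slot.lock().unwrap();
    let is_same = matches!(guard.as_ref(), Some(current) if Arc::ptr_eq(current, failed));
    if is_same {
        guard.take(); // next retry creates a fresh stream
    }
    is_same
}
```

Pointer identity is the right test here because two reconnect attempts can produce streams with identical configuration; only `Arc::ptr_eq` tells us whether the slot was already replaced.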
```rust
    client_secret: &str,
) -> Result<UnityCatalogTableSchema, ZerobusSinkError> {
    let http_client =
        HttpClient::new(TlsSettings::default(), &ProxyConfig::default()).map_err(|e| {
```
Use configured proxy for Unity Catalog schema fetches
Schema discovery always constructs its HTTP client with ProxyConfig::default(), so user-configured proxy settings from sink context are ignored. In environments that require egress via proxy, schema.type = "unity_catalog" can fail at startup even though the sink is otherwise configured correctly. This should use the sink/runtime proxy configuration instead of hardcoded defaults.
Done for the UC call; sadly the Zerobus SDK doesn't seem to support passing in a separate ProxyConfig.
License is being changed to Apache 2 here: databricks/zerobus-sdk#198 Thanks for the patience :)
💡 Codex Review (reviewed commit: 059cb46452)
```rust
let element_type = parse_complex_type(element_type_json)?;
Ok(ComplexType::Array(Box::new(element_type)))
```
Preserve array element nullability from Unity schema
parse_array_type only reads elementType and immediately constructs ComplexType::Array, so Unity Catalog's containsNull flag is discarded. That makes schemas like array<...> with containsNull = true look fully representable even though protobuf repeated fields cannot encode a true null element, which can cause runtime encode failures (or forced coercion) when events include null entries in those arrays. The descriptor generator should either reject nullable-element arrays up front or model them with an explicit wrapper that can encode nullability.
This shouldn't matter because the source is non-nullable map keys/values and non-nullable array elements.
Force-pushed from 05e3f2e to a738888.
💡 Codex Review (reviewed commit: a738888965)
Force-pushed from a738888 to 171d5cd.
I have read the CLA Document and I hereby sign the CLA |
💡 Codex Review (reviewed commit: 171d5cd2ea)
```rust
Ok(Some(offset)) if self.require_acknowledgements => {
    stream.wait_for_offset(offset).await.map(|_| ())
}
Ok(_) => Ok(()),
```
Fail when ack mode receives no ingest offset
When acknowledgements are enabled, this branch still treats ingest_records_offset returning Ok(None) as success, so events are marked delivered without waiting for any server-side confirmation. If the SDK ever returns None (e.g., offset not available for a successful ingest call), the sink violates its advertised acknowledgement semantics and can acknowledge data before it is durably accepted; in ack mode this should be surfaced as an error instead of falling through to Ok(()).
This only happens if we ever send a batch with 0 elements. Still, I added an error case for this situation, even though the code should never reach this state.
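The error case the author mentions could look roughly like the following. This is a hedged sketch, not the sink's actual code: the SDK types are replaced by `Result<Option<u64>, String>`, and the real implementation awaits `stream.wait_for_offset(offset)` in the first arm:

```rust
// In ack mode, a successful ingest call that yields no offset is surfaced
// as an error instead of being silently treated as delivered.
fn handle_ingest_result(
    result: Result<Option<u64>, String>,
    require_acknowledgements: bool,
) -> Result<(), String> {
    match result {
        // Real sink: stream.wait_for_offset(offset).await.map(|_| ())
        Ok(Some(_offset)) if require_acknowledgements => Ok(()),
        // Should be unreachable (only an empty batch produces no offset),
        // but fail loudly rather than ack undelivered data.
        Ok(None) if require_acknowledgements => {
            Err("ingest returned no offset while acknowledgements are enabled".to_string())
        }
        Ok(_) => Ok(()),
        Err(e) => Err(e),
    }
}
```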
Force-pushed from 171d5cd to 2971be3.
💡 Codex Review (reviewed commit: 2971be31bb)
Force-pushed from 2971be3 to 14d3847.
💡 Codex Review (reviewed commit: 14d3847388)
💡 Codex Review (reviewed commit: 0e0c4e45c1)
```toml
prost-types = { workspace = true, optional = true }

# Databricks Zerobus
databricks-zerobus-ingest-sdk = { git = "https://github.com/databricks/zerobus-sdk-rs", rev = "a963e81", optional = true }
```
Regenerate the third-party license inventory
This adds the new databricks-zerobus-ingest-sdk dependency and updates Cargo.lock, but LICENSE-3rdparty.csv was not updated (a repo search for databricks/zerobus only finds the lockfile entry). Vector's dependency license check expects new locked crates to be reflected in the license inventory, so this PR can fail CI or ship the SDK without the recorded license metadata; please run make build-licenses and commit the generated changes.
I assume we might still need databricks/zerobus-sdk#225
It passed now, licenses file updated!
💡 Codex Review (reviewed commit: 2dae9ad473)
```rust
if self.table_name.matches('.').count() != 2 {
    return Err(ZerobusSinkError::ConfigError {
        message: "table_name must be in format 'catalog.schema.table' (exactly 3 parts)"
```
Reject empty table-name segments during config validation
The table_name validation only counts dots, so values like catalog..table, .schema.table, or catalog.schema. pass validation even though they are not valid catalog.schema.table identifiers. These malformed values then fail later during Unity Catalog calls with a less actionable remote error, despite this function claiming to enforce exactly three parts. Split on . and ensure all three segments are non-empty to keep startup failures deterministic and user-facing.
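The suggested validation (split on `.` and require three non-empty segments) might look like this. The error type is simplified to `String` here; the sink itself returns `ZerobusSinkError::ConfigError`:

```rust
// Reject "catalog..table", ".schema.table", "catalog.schema." and similar
// malformed identifiers at config time, instead of letting them fail later
// with a less actionable Unity Catalog error.
fn validate_table_name(table_name: &str) -> Result<(), String> {
    let parts: Vec<&str> = table_name.split('.').collect();
    if parts.len() != 3 || parts.iter().any(|p| p.is_empty()) {
        return Err(format!(
            "table_name must be in format 'catalog.schema.table' (exactly 3 non-empty parts), got '{table_name}'"
        ));
    }
    Ok(())
}
```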
Force-pushed from 2dae9ad to c73e6c9.
💡 Codex Review (reviewed commit: c73e6c9565)
```rust
if !proxy.enabled {
    return None;
```
Override SDK proxy autodetection when proxy is disabled
When proxy.enabled is false, this function returns None, which leaves ZerobusSdk on its default environment-variable proxy autodetection path (the behavior is even documented in the surrounding comments). In environments where HTTP_PROXY/HTTPS_PROXY is set globally, Zerobus traffic will still be proxied despite explicit Vector config to disable proxies, causing unexpected routing and startup/ingest failures in no-proxy deployments. Instead of returning None in this branch, provide an explicit connector factory that forces direct connections.
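The suggestion amounts to modeling three distinct proxy states instead of collapsing "disabled" and "unconfigured" into `None`. A minimal sketch with hypothetical names (the real fix would hand the SDK an explicit connector factory for the `Direct` case):

```rust
// Three explicit outcomes, so proxy.enabled = false forces a direct
// connection instead of falling through to the SDK's HTTP(S)_PROXY
// environment-variable autodetection.
#[derive(Debug)]
enum ProxyChoice {
    Direct,           // proxy explicitly disabled: ignore env vars
    Explicit(String), // a proxy URL configured in Vector
    Autodetect,       // nothing configured: let the SDK autodetect
}

fn choose_proxy(enabled: bool, url: Option<String>) -> ProxyChoice {
    if !enabled {
        ProxyChoice::Direct
    } else if let Some(u) = url {
        ProxyChoice::Explicit(u)
    } else {
        ProxyChoice::Autodetect
    }
}
```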
💡 Codex Review (reviewed commit: 36e4d92f8a)
💡 Codex Review (reviewed commit: 1941f62f41)
Force-pushed from f7494f4 to 83c74f4.
FYI I will come back for a final review after @petere-datadog approves.
Force-pushed from 0e202a3 to 73f567a.
💡 Codex Review (reviewed commit: 73f567a675)
```rust
if let Some(max_bytes) = self.batch.max_bytes {
```
Enforce Zerobus 10MB cap when batch.max_bytes is unset
The validation only checks the 10MB SDK limit when batch.max_bytes is Some(...), so a config that explicitly sets batch.max_bytes: null can bypass this guard and produce oversized requests. Because into_batcher_settings() will then allow size-unbounded batches (bounded only by timeout/max_events), the sink can send >10MB payloads that the Zerobus SDK rejects at ingest time, turning a deterministic startup validation error into runtime delivery failures.
AFAIK validate will merge in the default. Added a test to confirm that this is indeed what happens: we always cap the batch size at 10MB.
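The invariant being discussed (an unset `batch.max_bytes` falls back to the documented limit rather than becoming unbounded) can be sketched like this. The 10MB constant comes from the review thread; the function name and error type are illustrative:

```rust
// Zerobus rejects requests above 10MB, per the review discussion.
const ZEROBUS_MAX_BATCH_BYTES: usize = 10 * 1024 * 1024;

// Merge the default when unset and reject explicit oversizes at startup,
// so the cap always holds and never degrades into a runtime ingest error.
fn effective_max_bytes(configured: Option<usize>) -> Result<usize, String> {
    match configured {
        None => Ok(ZEROBUS_MAX_BATCH_BYTES),
        Some(n) if n <= ZEROBUS_MAX_BATCH_BYTES => Ok(n),
        Some(n) => Err(format!("batch.max_bytes {n} exceeds the 10MB Zerobus limit")),
    }
}
```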
Summary
Databricks provides a Zerobus ingest connector [1], a push-based API that writes data directly into Unity Catalog Delta tables. This PR introduces a new Vector sink that integrates with Databricks, allowing Vector to push data into Databricks. We use the Databricks-provided SDK to implement the sink [2].
Zerobus supports row-level ingestion, and that's what we do here. Zerobus also has arrow batch ingestion in experimental mode, but we didn't add support for it. We will swap out row-level ingestion once arrow batch becomes stable, and that will be the future default.
With row-based ingestion, we extended the BatchSerializerConfig to support a batch serializer that creates vectors of protocol buffer bytes. This makes it the second option for doing batch serialization, alongside arrow batch.
Users do not have to specify the schema at all; we fetch the schema for them from Unity Catalog and then use it with the API. If users want to make schema changes, they should update their table as needed. We don't have a lot of support for dynamic schema changes at the moment.
Vector configuration
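The configuration section was left empty in the description; the sketch below shows roughly what a config might look like, assembled from option names that appear elsewhere in this thread (`table_name`, `schema.type = "unity_catalog"`, `client_secret`, `batch.max_bytes`, acknowledgements). Treat every key as an assumption rather than the sink's documented schema:

```toml
[sinks.databricks]
type = "databricks_zerobus"
inputs = ["my_logs"]

# Fully qualified Unity Catalog table: catalog.schema.table (three non-empty parts).
table_name = "main.default.vector_events"

# OAuth client credentials for the Databricks workspace (key names assumed).
client_id = "..."
client_secret = "..."

# Fetch the table schema from Unity Catalog; no manual schema needed.
schema.type = "unity_catalog"

[sinks.databricks.batch]
# Must stay at or below the Zerobus 10MB request limit.
max_bytes = 10_000_000

[sinks.databricks.acknowledgements]
enabled = true
```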
How did you test this PR?
Unit tests, running small toy examples and using it in production for actual traffic.
Change Type
Is this a breaking change?
Does this PR include user facing changes?
no-changelog label to this PR.
References
[1] https://docs.databricks.com/aws/en/ingestion/zerobus-overview
[2] https://github.com/databricks/zerobus-sdk
Notes
- Use `@vectordotdev/vector` to reach out to us regarding this PR.
- If you are not using the pre-push hook, please see this template.
- `make fmt`
- `make check-clippy` (if there are failures it's possible some of them can be fixed with `make clippy-fix`)
- `make test`
- `git merge origin master` and `git push`.
- If `Cargo.lock` changed, please run `make build-licenses` to regenerate the license inventory and commit the changes (if any). More details here.