enterprise/streaming-events.mdx
- **Rich semantics**: Built-in support for traces, logs, and metrics with standardized attribute naming conventions
- **Correlation**: Trace IDs link related events across agent invocations, LLM calls, and workforce executions

You don't need to run an OTEL collector to use this feature. Events are delivered directly to your destination where you can:

- Query them directly using Athena, BigQuery, or similar tools
- Ingest into your data lake (Snowflake, Databricks, etc.)
- Forward to any OTEL-compatible backend for visualization and alerting

<Note>Enterprise customers can enable PII redaction to automatically protect sensitive information in logs. Contact your Account Manager to learn more.</Note>

<Note>Supported destinations are Amazon S3 and Databricks Delta Sharing. Support for direct OTEL collector endpoints is on our roadmap.</Note>

---

## Supported destinations

### Amazon S3

Stream events to an S3 bucket in your AWS account. Use this option if you want to query data directly with Athena, ingest into a data lake, or route events to an OTEL-compatible backend. Relevance writes gzipped OTEL JSON files to a bucket and prefix you specify.
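The delivered objects are standard gzipped OTEL JSON, so any downstream consumer can decompress and parse them. The sketch below shows the round trip in TypeScript; the event shape and span name are illustrative placeholders, not a guaranteed export schema.

```typescript
import { gzipSync, gunzipSync } from "node:zlib";

// Hypothetical OTEL JSON event of the kind Relevance writes (gzipped) to
// s3://YOUR_BUCKET_NAME/<prefix>/... — bucket, prefix, and span name are
// placeholders for illustration only.
const event = {
  resourceSpans: [{ scopeSpans: [{ spans: [{ name: "chat gpt-4o" }] }] }],
};
const payload = gzipSync(Buffer.from(JSON.stringify(event), "utf-8"));

// A downstream consumer (Lambda, batch job, etc.) decompresses each
// S3 object and parses the JSON body:
const decoded = JSON.parse(gunzipSync(payload).toString("utf-8"));
console.log(decoded.resourceSpans[0].scopeSpans[0].spans[0].name); // "chat gpt-4o"
```

Athena and similar engines can likewise read gzipped JSON natively, so no re-encoding step is needed before querying.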


### Databricks Delta Sharing


Stream events directly into your Databricks environment via Delta Sharing with native Unity Catalog integration. Use this option if your organization runs Databricks and wants events available as Delta tables for SQL analytics, notebooks, or downstream pipelines without an intermediate S3 step.


---

## Setup

### Amazon S3 setup

#### Prerequisites

- AWS account with permissions to create S3 buckets and bucket policies
- Relevance AI Enterprise plan

#### 1. Create an S3 bucket

Create a bucket in the **same AWS region** as your Relevance data:

| Region | AWS region |
| --- | --- |
| Europe | `eu-west-2` (London) |
| US | `us-east-1` (N. Virginia) |
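If you prefer the AWS CLI to the console, a bucket can be created as follows; `YOUR_BUCKET_NAME` is a placeholder, and the region should match your Relevance data region (note that non-`us-east-1` regions require the `LocationConstraint` argument):

```shell
# Create the bucket in the same region as your Relevance data.
aws s3api create-bucket \
  --bucket YOUR_BUCKET_NAME \
  --region eu-west-2 \
  --create-bucket-configuration LocationConstraint=eu-west-2
```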

#### 2. Configure bucket policy

Add this policy to allow Relevance to write events to your bucket:

- `YOUR_BUCKET_NAME`: Your S3 bucket name
- `RELEVANCE_EVENT_CONSUMER_ROLE_ARN`: Contact your Relevance team for the region-specific IAM role ARN

#### 3. Provide configuration to Relevance

Send your Account Manager or support team:


---

### Databricks Delta Sharing setup

#### Prerequisites

- Unity Catalog-enabled Databricks workspace
- Relevance AI Enterprise plan
- Databricks admin access

#### Configuration

Databricks Delta Sharing is not self-serve. Configuration is handled by the Relevance AI team via an internal admin process. Relevance provisions a Delta Sharing share directly into your Unity Catalog, so events appear as native Delta tables in your Databricks environment without requiring S3 or a separate ingestion pipeline.

To get started, provide your Account Manager with:

- **Databricks workspace URL**: The URL of the workspace where events should be delivered
- **Unity Catalog metastore ID**: The metastore where events should be shared
- **Recipient identifier**: The Delta Sharing recipient name or email for your Databricks account

Your Account Manager will coordinate with the Relevance team to complete the setup.

<Warning>There is no self-serve UI to configure Databricks Delta Sharing. Contact your Account Manager to enable this Enterprise feature for your organization.</Warning>

---

## PII Redaction (Enterprise Feature)

PII (Personally Identifiable Information) redaction is an org-level feature that automatically scrubs sensitive information like email addresses, phone numbers, credit card numbers, and names before your event data leaves the platform and is delivered to your export destination.

PII redaction applies at the point of data export, specifically when telemetry and audit logs are written to your destination. It does not apply to live agent conversations in real time or to data stored internally on Relevance AI's side. Think of it as a "scrub before delivery" mechanism for your downstream data pipeline.

<Note>PII redaction is a contractual Enterprise feature. Contact your Account Manager to enable this capability for your organization.</Note>


**Name**: `chat {model_name}`

<Warning>This span was previously named `llm_completion`. If you have existing dashboards or queries filtering on `gen_ai.operation.name = "llm_completion"` or span names containing `llm_completion`, update them to use `"chat"`. The rename aligns with the [OpenTelemetry GenAI semantic conventions](https://opentelemetry.io/docs/specs/semconv/gen-ai/).</Warning>


| Attribute | Type | Required | Description |
| --- | --- | --- | --- |

**Attribute**: `gen_ai.tool.definitions`

<Warning>`gen_ai.tool.definitions` was previously an array type (`arrayValue`). It is now a JSON string (`stringValue`). If you consume this field in your pipeline or queries, you need to call `JSON.parse()` on the value to work with the tool definitions as structured data. The content is identical - the same tool definitions are present, just serialized as a string for better compatibility across OTEL exporters and backends.</Warning>
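Adapting a consumer to the new serialization is a one-line change. A minimal sketch, assuming an OTEL attribute record of the shape `{ key, value: { stringValue } }` and a hypothetical tool definition payload:

```typescript
// Hypothetical exported attribute after the change:
// `gen_ai.tool.definitions` is now a JSON string, not an array.
const attr = {
  key: "gen_ai.tool.definitions",
  value: {
    stringValue: '[{"name":"search","description":"Search the web"}]',
  },
};

// Parse the serialized string back into structured tool definitions.
const toolDefinitions = JSON.parse(attr.value.stringValue);
console.log(toolDefinitions[0].name); // "search"
```

Pipelines that previously read `value.arrayValue` directly should instead read `value.stringValue` and parse it as above.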


### `multi_agent_system_trigger`
