Merged
12 changes: 12 additions & 0 deletions agent-framework/TOC.yml
@@ -40,6 +40,8 @@ items:
href: agents/observability.md
- name: Agent Skills
href: agents/skills.md
- name: Agent Safety
href: agents/safety.md
- name: Tools
items:
- name: Overview
@@ -152,6 +154,16 @@ items:
href: workflows/orchestrations/group-chat.md
- name: Magentic
href: workflows/orchestrations/magentic.md
- name: Advanced
items:
- name: Agent Executor
href: workflows/advanced/agent-executor.md
- name: Execution Modes
href: workflows/advanced/execution-modes.md
- name: Resettable Executors
href: workflows/advanced/resettable-executors.md
- name: Sub-Workflows
href: workflows/advanced/sub-workflows.md
- name: Integrations
items:
- name: Overview
5 changes: 1 addition & 4 deletions agent-framework/agents/conversations/compaction.md
@@ -650,7 +650,4 @@ compacted = await apply_compaction(
## Next steps

> [!div class="nextstepaction"]
> [Context Providers](context-providers.md)

> [!div class="nextstepaction"]
> [Storage](storage.md)
> [Middleware](../middleware/index.md)
2 changes: 1 addition & 1 deletion agent-framework/agents/conversations/storage.md
@@ -361,4 +361,4 @@ resumed = AgentSession.from_dict(serialized)
## Next steps

> [!div class="nextstepaction"]
> [Running Agents](../running-agents.md)
> [Compaction](./compaction.md)
47 changes: 41 additions & 6 deletions agent-framework/agents/middleware/agent-vs-run-scope.md
@@ -20,7 +20,7 @@ When both are registered, agent-level middleware runs first (outermost), followe

:::zone pivot="programming-language-csharp"

In C#, middleware is registered on an agent using the builder pattern. Agent-level middleware is applied during agent construction, while run-level middleware can be provided via `AgentRunOptions`.
In C#, middleware is registered on an agent using the builder pattern with `.AsBuilder().Use(...).Build()`. Agent-level middleware is applied during agent construction and persists across all runs. Run-level middleware uses the same pattern but builds a decorated agent inline before calling `RunAsync` or `RunStreamingAsync`.

### Agent-level middleware

@@ -30,6 +30,7 @@ Agent-level middleware is registered at construction time and applies to every r
using System;
using System.Collections.Generic;
using System.Linq;
using System.Runtime.CompilerServices;
using System.Threading;
using System.Threading.Tasks;
using Azure.AI.OpenAI;
@@ -50,6 +51,20 @@ async Task<AgentResponse> SecurityMiddleware(
return response;
}

async IAsyncEnumerable<AgentResponseUpdate> SecurityStreamingMiddleware(
IEnumerable<ChatMessage> messages,
AgentSession? session,
AgentRunOptions? options,
AIAgent innerAgent,
[EnumeratorCancellation] CancellationToken cancellationToken)
{
Console.WriteLine("[Security] Validating streaming request...");
await foreach (var update in innerAgent.RunStreamingAsync(messages, session, options, cancellationToken))
{
yield return update;
}
}

AIAgent baseAgent = new AzureOpenAIClient(
new Uri("https://<myresource>.openai.azure.com"),
new AzureCliCredential())
@@ -59,15 +74,15 @@ AIAgent baseAgent = new AzureOpenAIClient(
// Register middleware at the agent level
var agentWithMiddleware = baseAgent
.AsBuilder()
.Use(runFunc: SecurityMiddleware, runStreamingFunc: null)
.Use(runFunc: SecurityMiddleware, runStreamingFunc: SecurityStreamingMiddleware)
.Build();

Console.WriteLine(await agentWithMiddleware.RunAsync("What's the weather in Paris?"));
```

### Run-level middleware

Run-level middleware is provided per request via `AgentRunOptions`:
Run-level middleware uses the same builder pattern, applied inline for a specific invocation:

```csharp
// Run-level middleware: applied to a specific run only
@@ -84,11 +99,31 @@ async Task<AgentResponse> DebugMiddleware(
return response;
}

// Pass run-level middleware via AgentRunOptions for this specific call
var runOptions = new AgentRunOptions { RunMiddleware = DebugMiddleware };
Console.WriteLine(await baseAgent.RunAsync("What's the weather in Tokyo?", options: runOptions));
async IAsyncEnumerable<AgentResponseUpdate> DebugStreamingMiddleware(
IEnumerable<ChatMessage> messages,
AgentSession? session,
AgentRunOptions? options,
AIAgent innerAgent,
[EnumeratorCancellation] CancellationToken cancellationToken)
{
Console.WriteLine($"[Debug] Input messages: {messages.Count()}");
await foreach (var update in innerAgent.RunStreamingAsync(messages, session, options, cancellationToken))
{
yield return update;
}
}

// Apply run-level middleware by building a decorated agent inline for this specific call
Console.WriteLine(await baseAgent
.AsBuilder()
.Use(runFunc: DebugMiddleware, runStreamingFunc: DebugStreamingMiddleware)
.Build()
.RunAsync("What's the weather in Tokyo?"));
```

> [!TIP]
> The `.AsBuilder().Use(...).Build()` pattern creates a lightweight wrapper around the original agent. You can chain multiple `.Use()` calls to compose several middleware for a single invocation.

:::zone-end

:::zone pivot="programming-language-python"
2 changes: 1 addition & 1 deletion agent-framework/agents/middleware/index.md
@@ -881,4 +881,4 @@ if __name__ == "__main__":
## Next steps

> [!div class="nextstepaction"]
> [Agent Background Responses](../background-responses.md)
> [Defining Middleware](./defining-middleware.md)
2 changes: 1 addition & 1 deletion agent-framework/agents/middleware/runtime-context.md
@@ -212,4 +212,4 @@ Use `function_invocation_kwargs` for tool-invocation flows and `client_kwargs` f
## Next steps

> [!div class="nextstepaction"]
> [Middleware Overview](./index.md)
> [Providers](../providers/index.md)
31 changes: 19 additions & 12 deletions agent-framework/agents/observability.md
@@ -31,7 +31,7 @@ var instrumentedChatClient = new AzureOpenAIClient(new Uri(endpoint), new Defaul
.GetChatClient(deploymentName)
.AsIChatClient() // Converts a native OpenAI SDK ChatClient into a Microsoft.Extensions.AI.IChatClient
.AsBuilder()
.UseOpenTelemetry(sourceName: "MyApplication", configure: (cfg) => cfg.EnableSensitiveData = true) // Enable OpenTelemetry instrumentation with sensitive data
.UseOpenTelemetry(sourceName: SourceName, configure: (cfg) => cfg.EnableSensitiveData = true) // Enable OpenTelemetry instrumentation with sensitive data
.Build();
```

@@ -46,7 +46,7 @@ var agent = new ChatClientAgent(
name: "OpenTelemetryDemoAgent",
instructions: "You are a helpful assistant that provides concise and informative responses.",
tools: [AIFunctionFactory.Create(GetWeatherAsync)]
).WithOpenTelemetry(sourceName: "MyApplication", enableSensitiveData: true); // Enable OpenTelemetry instrumentation with sensitive data
).WithOpenTelemetry(sourceName: SourceName, configure: (cfg) => cfg.EnableSensitiveData = true); // Enable OpenTelemetry instrumentation with sensitive data
```

> [!IMPORTANT]
@@ -70,7 +70,9 @@ using OpenTelemetry.Trace;
using OpenTelemetry.Resources;
using System;

var SourceName = "MyApplication";
// The source name under which all activities, metrics, and logs will be emitted.
const string SourceName = "MyApplication";
const string ServiceName = "AgentOpenTelemetry";

var applicationInsightsConnectionString = Environment.GetEnvironmentVariable("APPLICATION_INSIGHTS_CONNECTION_STRING")
?? throw new InvalidOperationException("APPLICATION_INSIGHTS_CONNECTION_STRING is not set.");
@@ -82,12 +84,13 @@ var resourceBuilder = ResourceBuilder
using var tracerProvider = Sdk.CreateTracerProviderBuilder()
.SetResourceBuilder(resourceBuilder)
.AddSource(SourceName)
.AddSource("*Microsoft.Extensions.AI") // Listen to the Experimental.Microsoft.Extensions.AI source for chat client telemetry.
.AddSource("*Microsoft.Extensions.Agents*") // Listen to the Experimental.Microsoft.Extensions.Agents source for agent telemetry.
.AddAzureMonitorTraceExporter(options => options.ConnectionString = applicationInsightsConnectionString)
.Build();
```

> [!TIP]
> The `AddSource` method specifies the source name that the provider listens to. Make sure it matches the source name used in your instrumentation code (for example, `UseOpenTelemetry(sourceName: SourceName)`). If no source name is specified in the instrumentation code, it defaults to `Experimental.Microsoft.Agents.AI`; in that case, use `AddSource("Experimental.Microsoft.Agents.AI")` in both your tracer provider and meter provider configuration.

> [!TIP]
> Depending on your backend, you can use different exporters. For more information, see the [OpenTelemetry .NET documentation](https://opentelemetry.io/docs/instrumentation/net/exporters/). For local development, consider using the [Aspire Dashboard](#aspire-dashboard).

@@ -112,7 +115,6 @@ var resourceBuilder = ResourceBuilder
using var meterProvider = Sdk.CreateMeterProviderBuilder()
.SetResourceBuilder(resourceBuilder)
.AddSource(SourceName)
.AddMeter("*Microsoft.Agents.AI") // Agent Framework metrics
.AddAzureMonitorMetricExporter(options => options.ConnectionString = applicationInsightsConnectionString)
.Build();
```
@@ -154,8 +156,6 @@ Consider using the Aspire Dashboard as a quick way to visualize your traces and
using var tracerProvider = Sdk.CreateTracerProviderBuilder()
.SetResourceBuilder(resourceBuilder)
.AddSource(SourceName)
.AddSource("*Microsoft.Extensions.AI") // Listen to the Experimental.Microsoft.Extensions.AI source for chat client telemetry.
.AddSource("*Microsoft.Extensions.Agents*") // Listen to the Experimental.Microsoft.Extensions.Agents source for agent telemetry.
.AddOtlpExporter(options => options.Endpoint = new Uri("http://localhost:4317"))
.Build();
```
@@ -174,23 +174,27 @@ See a full example of an agent with OpenTelemetry enabled in the [Agent Framewor
## Dependencies

### Included packages

To enable observability in your Python application, the following OpenTelemetry packages are installed by default:

- [opentelemetry-api](https://pypi.org/project/opentelemetry-api/)
- [opentelemetry-sdk](https://pypi.org/project/opentelemetry-sdk/)
- [opentelemetry-semantic-conventions-ai](https://pypi.org/project/opentelemetry-semantic-conventions-ai/)


### Exporters

We do *not* install exporters by default, to avoid unnecessary dependencies and potential issues with auto-instrumentation. A wide variety of exporters is available for different backends, so you can choose the ones that best fit your needs.

Some common exporters you may want to install based on your needs:

- For gRPC protocol support: install `opentelemetry-exporter-otlp-proto-grpc`
- For HTTP protocol support: install `opentelemetry-exporter-otlp-proto-http`
- For Azure Application Insights: install `azure-monitor-opentelemetry`

Use the [OpenTelemetry Registry](https://opentelemetry.io/ecosystem/registry/?language=python&component=instrumentation) to find more exporters and instrumentation packages.

## Enable Observability (Python)

### Five patterns for configuring observability

We've identified multiple ways to configure observability in your application, depending on your needs:
@@ -330,6 +334,7 @@ The following environment variables control Agent Framework observability:
The `configure_otel_providers()` function automatically reads standard OpenTelemetry environment variables:

**OTLP Configuration** (for Aspire Dashboard, Jaeger, etc.):

- `OTEL_EXPORTER_OTLP_ENDPOINT` - Base endpoint for all signals (e.g., `http://localhost:4317`)
- `OTEL_EXPORTER_OTLP_TRACES_ENDPOINT` - Traces-specific endpoint (overrides base)
- `OTEL_EXPORTER_OTLP_METRICS_ENDPOINT` - Metrics-specific endpoint (overrides base)
@@ -338,6 +343,7 @@ The `configure_otel_providers()` function automatically reads standard OpenTelem
- `OTEL_EXPORTER_OTLP_HEADERS` - Headers for all signals (e.g., `key1=value1,key2=value2`)

**Service Identification**:

- `OTEL_SERVICE_NAME` - Service name (default: `agent_framework`)
- `OTEL_SERVICE_VERSION` - Service version (default: package version)
- `OTEL_RESOURCE_ATTRIBUTES` - Additional resource attributes
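These variables are typically exported in your shell or deployment manifest, but for a quick illustration they can also be set in-process before providers are configured; the endpoint and service name below are example values:

```python
import os

# Example values; replace with your collector endpoint and service name.
os.environ["OTEL_EXPORTER_OTLP_ENDPOINT"] = "http://localhost:4317"
os.environ["OTEL_SERVICE_NAME"] = "my-agent-app"
```

Set these before calling `configure_otel_providers()`, since the values are read at configuration time.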
@@ -356,7 +362,8 @@ Make sure you have your Foundry configured with an Azure Monitor instance, see [d
pip install azure-monitor-opentelemetry
```

#### Configure observability directly from the `AzureAIClient`:
#### Configure observability directly from the `AzureAIClient`

For Foundry projects, you can configure observability directly from the `AzureAIClient`:

```python
Expand All @@ -377,8 +384,8 @@ async def main():
> [!TIP]
> The arguments for `client.configure_azure_monitor()` are passed through to the underlying `configure_azure_monitor()` function from the `azure-monitor-opentelemetry` package; see the [documentation](/python/api/overview/azure/monitor-opentelemetry-readme#usage) for details. The connection string and resource are set for you.

#### Configure azure monitor and optionally enable instrumentation

#### Configure azure monitor and optionally enable instrumentation:
For non-Foundry projects with Application Insights, make sure you set up a custom agent in Foundry; see [details](/azure/ai-foundry/control-plane/register-custom-agent).

Then run your agent with the same _OpenTelemetry agent ID_ as registered in Foundry, and configure Azure Monitor as follows:
@@ -572,4 +579,4 @@ if __name__ == "__main__":
## Next steps

> [!div class="nextstepaction"]
> [Tools overview](tools/index.md)
> [Agent Skills](skills.md)
2 changes: 1 addition & 1 deletion agent-framework/agents/providers/anthropic.md
@@ -515,4 +515,4 @@ See the [Agent getting started tutorials](../../get-started/your-first-agent.md)
## Next steps

> [!div class="nextstepaction"]
> [Azure AI Agents](./microsoft-foundry.md)
> [Ollama](./ollama.md)
2 changes: 1 addition & 1 deletion agent-framework/agents/providers/copilot-studio.md
@@ -48,4 +48,4 @@ Console.WriteLine(await agent.RunAsync("What are our company policies on remote
## Next steps

> [!div class="nextstepaction"]
> [Providers Overview](./index.md)
> [Custom Provider](./custom.md)
2 changes: 1 addition & 1 deletion agent-framework/agents/providers/github-copilot.md
@@ -396,4 +396,4 @@ For more information on how to run and interact with agents, see the [Agent gett
## Next steps

> [!div class="nextstepaction"]
> [Custom Agents](./custom.md)
> [Copilot Studio](./copilot-studio.md)
2 changes: 1 addition & 1 deletion agent-framework/agents/providers/microsoft-foundry.md
@@ -604,4 +604,4 @@ For more information on how to run and interact with agents, see the [Agent gett
## Next steps

> [!div class="nextstepaction"]
> [Foundry Models based Agents](./microsoft-foundry.md)
> [Anthropic](./anthropic.md)
2 changes: 1 addition & 1 deletion agent-framework/agents/providers/ollama.md
@@ -205,4 +205,4 @@ async def streaming_example():
## Next steps

> [!div class="nextstepaction"]
> [Providers Overview](./index.md)
> [GitHub Copilot](./github-copilot.md)
2 changes: 1 addition & 1 deletion agent-framework/agents/providers/openai.md
@@ -320,4 +320,4 @@ For more information, see the [Get Started tutorials](../../get-started/your-fir
## Next steps

> [!div class="nextstepaction"]
> [Azure OpenAI](./azure-openai.md)
> [Microsoft Foundry](./microsoft-foundry.md)