Description
- OpenTelemetry #34
  - Add Prometheus Exporter for OpenTelemetry .NET
  - Report events (reconnect, processed message count, etc.)
- Versioning & Source Generator #24
- Introduce parallelism
  - Aggregation of consumed messages (ability to pipe them through `BatchBlock` and the like while still being able to acknowledge at the end)
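The aggregation idea above (BatchBlock-style batching with acknowledgment deferred to the end of the batch) is language-agnostic. A minimal Python sketch, with hypothetical `Message` and `BatchingConsumer` names standing in for the .NET types:

```python
from dataclasses import dataclass, field
from typing import Callable, List

@dataclass
class Message:
    id: str
    payload: str

@dataclass
class BatchingConsumer:
    """Buffers consumed messages and acknowledges them only after the
    whole batch was processed successfully (the BatchBlock idea)."""
    batch_size: int
    handle_batch: Callable[[List[Message]], None]
    ack: Callable[[str], None]               # acknowledge one message id
    _buffer: List[Message] = field(default_factory=list)

    def on_message(self, msg: Message) -> None:
        self._buffer.append(msg)
        if len(self._buffer) >= self.batch_size:
            self.flush()

    def flush(self) -> None:
        if not self._buffer:
            return
        batch, self._buffer = self._buffer, []
        self.handle_batch(batch)             # process the aggregated batch
        for m in batch:                      # ack only after the batch succeeded
            self.ack(m.id)
```

If `handle_batch` throws, no message in the batch is acknowledged, so the broker can redeliver all of them.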
- Async Enumerable #33
- Open Closed key & metadata convention #42
- Dynamic Partitioning & Sharding: a strategy lambda that derives the source name from the metadata. For example: Tenant, Geo
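The strategy-lambda idea reduces to a pluggable function from metadata to a stream name. A Python sketch (names like `PartitionStrategy` and `resolve_stream` are illustrative, not the project's API):

```python
from typing import Callable, Mapping

# A "strategy lambda": maps a base source name plus message metadata to a
# concrete stream name, so partitioning (by tenant, geo, ...) stays pluggable.
PartitionStrategy = Callable[[str, Mapping[str, str]], str]

by_tenant_and_geo: PartitionStrategy = lambda base, meta: (
    f"{base}:{meta.get('geo', 'global')}:{meta.get('tenant', 'shared')}"
)

def resolve_stream(base: str, meta: Mapping[str, str],
                   strategy: PartitionStrategy) -> str:
    """Pick the target stream for one message."""
    return strategy(base, meta)
```

Because the strategy is injected, swapping a tenant-based scheme for a geo-based one is a one-line change at the composition root.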
- Forwarder #30
- Circuit-breaker + Dead stream #29
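The circuit-breaker + dead-stream pairing above can be sketched independently of the transport: after repeated failures the breaker opens and messages are diverted to a dead stream instead of being retried. A hedged Python sketch (the `CircuitBreaker`/`consume` names and thresholds are assumptions, not the project's design):

```python
import time

class CircuitBreaker:
    """Opens after `threshold` consecutive failures; while open, callers
    should divert messages to a dead stream instead of retrying."""
    def __init__(self, threshold: int, cooldown: float):
        self.threshold = threshold
        self.cooldown = cooldown
        self.failures = 0
        self.opened_at = None

    def is_open(self) -> bool:
        if self.opened_at is None:
            return False
        if time.monotonic() - self.opened_at >= self.cooldown:
            self.opened_at = None        # half-open: allow a trial call
            self.failures = 0
            return False
        return True

    def record(self, ok: bool) -> None:
        if ok:
            self.failures = 0
        else:
            self.failures += 1
            if self.failures >= self.threshold:
                self.opened_at = time.monotonic()

def consume(msg, handler, breaker: CircuitBreaker, dead_stream: list) -> None:
    if breaker.is_open():
        dead_stream.append(msg)          # park the message on the dead stream
        return
    try:
        handler(msg)
        breaker.record(ok=True)
    except Exception:
        breaker.record(ok=False)
        dead_stream.append(msg)
```

In a real deployment the dead stream would itself be a Redis stream that an operator or a repair job drains later.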
- Partial subscription #35
- Source Generation
  - Producer
  - Consumer
  - Remove the reflection-based implementation
- Allow plan mutation on builder's Build()
- Metadata should be available via context (stateless style).
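"Stateless style" here means handlers read message metadata from the ambient execution context instead of taking it as an explicit parameter. A Python sketch of the idea using `contextvars` (in .NET the analogue would be an `AsyncLocal`-backed context; the `message_scope` name is hypothetical):

```python
import contextvars
from contextlib import contextmanager
from typing import Iterator, Mapping

# Ambient, per-message metadata: code anywhere on the call stack can read it
# without it being threaded through every signature.
_metadata: contextvars.ContextVar = contextvars.ContextVar("metadata", default={})

@contextmanager
def message_scope(meta: Mapping[str, str]) -> Iterator[None]:
    """Installs metadata for the duration of handling one message."""
    token = _metadata.set(meta)
    try:
        yield
    finally:
        _metadata.reset(token)

def current_metadata() -> Mapping[str, str]:
    return _metadata.get()
```

The consumer framework would wrap each handler invocation in a scope like this, so user code simply calls `current_metadata()`.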
- Client Recovery
  - Use Polly // https://github.com/App-vNext/Polly
  - Producer - `SendAsync()`
  - Consumer (inject Polly's policy via the builder, into the plan)
    - Use Polly // https://github.com/App-vNext/Polly
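Polly is a .NET resilience library, so the concrete API lives there; the shape of the idea, though, is a retry policy injected through the builder and wrapped around the send call. A language-agnostic Python sketch (the `retry_policy`/`Producer` names are illustrative, not Polly's API):

```python
import time
from typing import Callable, TypeVar

T = TypeVar("T")

def retry_policy(attempts: int, base_delay: float) -> Callable:
    """Rough analogue of a wait-and-retry policy: retry a call with
    exponential backoff, re-raising after the last attempt."""
    def execute(action: Callable[[], T]) -> T:
        for attempt in range(attempts):
            try:
                return action()
            except Exception:
                if attempt == attempts - 1:
                    raise
                time.sleep(base_delay * (2 ** attempt))
        raise AssertionError("unreachable")
    return execute

class Producer:
    """The builder injects the policy; every send runs under it."""
    def __init__(self, transport: Callable[[str], None], policy):
        self._transport = transport
        self._policy = policy

    def send(self, payload: str) -> None:
        self._policy(lambda: self._transport(payload))
```

Keeping the policy outside the producer means recovery behavior can differ per deployment without touching send logic.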
- Acknowledge
  - Acknowledge on success
  - Acknowledgment of the message can be available via context
- Pending & Claim
  - Self pending: should fetch on the first run, after an error, and the first time no element is available.
  - Work stealing
    - When other consumers' pending messages linger for too long.
    - Consumers should monitor pending messages and claim long-pending ones (XCLAIM).
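The claim decision itself is a pure function over the output of Redis's `XPENDING`: take messages owned by *other* consumers whose idle time exceeds a threshold, then pass those ids to `XCLAIM`. A Python sketch of just that decision (the `PendingEntry` shape mirrors an `XPENDING` row; names are illustrative):

```python
from dataclasses import dataclass
from typing import List

@dataclass
class PendingEntry:
    """One XPENDING row: message id, owning consumer, idle time in ms."""
    message_id: str
    consumer: str
    idle_ms: int

def messages_to_claim(pending: List[PendingEntry], me: str,
                      min_idle_ms: int) -> List[str]:
    """Work stealing: pick other consumers' messages that lingered past
    `min_idle_ms`; these ids would then be handed to XCLAIM."""
    return [p.message_id for p in pending
            if p.consumer != me and p.idle_ms >= min_idle_ms]
```

Using `XCLAIM`'s own `min-idle-time` argument as well makes the claim race-safe: if another consumer claimed the message first, the idle time resets and the claim is a no-op.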
- Builder should enable a different channel for messages & segments (it might actually be a channel per segment key), for example S3
- Enable a base channel for the Redis Stream + injection for handling different storage for segments & interceptors
- Integration Tests
  - Should delete the Redis key on test disposal
  - Should verify that no messages are left pending
  - Should ensure message sequence
  - Test the crash scenario where the same consumer name continues after the crash
- Consumer should define a Consumer Group [optional]
- Redis Channel: Provider (stream + hash) (options: batch)
  - Default segmentation
- ProducerBase should be implemented as a unit of work in order to support multiple parameters
- Get logger
- Use a class instead of `using Bucket = System.Collections.Immutable.ImmutableDictionary<string, System.ReadOnlyMemory<byte>>;`
- Execution pipeline (coordinates the different stages: interceptors, segmentation, channel)
  - build -> create metadata -> create channel (meta, segmentation) -> decorate {for interception} -> test channel
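The staged pipeline above is essentially function composition: each stage wraps the next and enriches the message envelope before delegating. A small Python sketch of that composition (stage names like `with_metadata` are illustrative, and the "channel" here is a stand-in for the real transport):

```python
from typing import Callable, Dict

# A send stage: (payload, metadata) -> final envelope.
Send = Callable[[str, Dict[str, str]], Dict[str, str]]

def base_channel(payload: str, meta: Dict[str, str]) -> Dict[str, str]:
    # Stand-in for the transport (e.g. a Redis stream): echoes the envelope.
    return {"payload": payload, **meta}

def with_metadata(next_stage: Send) -> Send:
    def stage(payload: str, meta: Dict[str, str]) -> Dict[str, str]:
        # "create metadata" step: stamp ids, timestamps, etc.
        return next_stage(payload, {**meta, "message_id": "1-0"})
    return stage

def with_interceptor(next_stage: Send) -> Send:
    def stage(payload: str, meta: Dict[str, str]) -> Dict[str, str]:
        # "decorate" step: interception point before the channel runs.
        return next_stage(payload, {**meta, "intercepted": "true"})
    return stage

def build() -> Send:
    # build -> metadata -> channel, decorated for interception (innermost last).
    return with_interceptor(with_metadata(base_channel))
```

Because every stage has the same signature, a test channel can replace `base_channel` without touching the rest of the pipeline.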
- Merging implementation
- Abstract Factory concept for producer / consumer clients
- Custom Segmentation (for GDPR and the like)
- Custom Serialization
- Custom Channel
- Consume raw messages: try to make this friendly for pattern matching (action name, with a case per type)