Overview
The Signal Platform processes real-time events from multiple sources, aggregates signals, and distributes processed data to downstream consumers. It handles high-throughput event streams while maintaining low latency for time-sensitive operations.
Architecture
The platform is built on edge workers for low-latency processing, with durable object storage for state management. Events flow through a pipeline of transformation stages, each optimized for specific processing requirements.
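To illustrate the pipeline idea, the sketch below composes transformation stages as pure functions over an event value. The event shape and stage names are assumptions for the example, not the platform's actual types; the point is that each stage is independent and compute stays stateless.

```typescript
// Minimal sketch of a stage pipeline. SignalEvent and the stages are
// hypothetical; each stage is a pure function so the compute layer holds no state.
interface SignalEvent {
  id: string;
  source: string;
  timestamp: number;
  payload: unknown;
}

type Stage = (event: SignalEvent) => SignalEvent;

// Hypothetical stages: normalize timestamps, enrich the payload with source metadata.
const normalize: Stage = (e) => ({ ...e, timestamp: Math.floor(e.timestamp) });
const enrich: Stage = (e) => ({ ...e, payload: { ...(e.payload as object), source: e.source } });

// Compose stages left to right; each stage can be optimized independently.
const pipeline = (stages: Stage[]) => (event: SignalEvent): SignalEvent =>
  stages.reduce((acc, stage) => stage(acc), event);

const process = pipeline([normalize, enrich]);
```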
Key Design Decisions
- Edge-first processing - Events are processed at the nearest edge location to minimize latency
- Stateless compute - Processing logic is stateless; state lives in purpose-built stores
- Schema-on-read - Flexible event schemas with validation at consumption time
- At-least-once delivery - Consumers handle idempotency; the platform guarantees delivery (see the consumer sketch after this list)
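Because delivery is at-least-once, consumers must tolerate redelivered events. A minimal sketch of an idempotent consumer is shown below; the event shape, the in-memory dedupe set, and the helper function are assumptions for illustration, and a real consumer would record processed IDs in a durable store.

```typescript
// Sketch of an idempotent consumer under at-least-once delivery.
// DeliveredEvent, the in-memory Set, and applySideEffects are hypothetical;
// production code would persist processed IDs durably, not in memory.
interface DeliveredEvent {
  id: string;       // unique per event, repeated on redelivery
  payload: unknown;
}

const processed = new Set<string>();

async function handleDelivery(event: DeliveredEvent): Promise<void> {
  if (processed.has(event.id)) {
    return; // duplicate redelivery: safe to acknowledge and skip
  }
  await applySideEffects(event); // consumer-specific business logic
  processed.add(event.id);       // record only after the work succeeds
}

async function applySideEffects(event: DeliveredEvent): Promise<void> {
  // placeholder for the consumer's actual work
}
```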
Key Components
- Event Ingestion - HTTP endpoints for event collection with schema validation (see the ingestion sketch after this list)
- Signal Aggregation - Real-time aggregation pipelines with configurable windows
- Data Distribution - Push-based delivery to consumers via webhooks and streams
- Replay System - Historical event replay for debugging and reprocessing
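As an illustration of the ingestion component, the sketch below shows a Cloudflare Worker endpoint that rejects oversized events (using the 64KB limit from the constraints below), does a basic shape check, and enqueues the event for downstream processing. The queue binding name, required fields, and validation rules are assumptions; the platform's real schema validation is not shown here. Types such as Queue come from @cloudflare/workers-types.

```typescript
// Sketch of an ingestion endpoint as a Cloudflare Worker.
// EVENT_QUEUE and the required fields (id, source) are hypothetical.
export interface Env {
  EVENT_QUEUE: Queue; // Cloudflare Queues producer binding (name is an assumption)
}

const MAX_EVENT_BYTES = 64 * 1024; // 64KB maximum event size

export default {
  async fetch(request: Request, env: Env): Promise<Response> {
    if (request.method !== "POST") {
      return new Response("method not allowed", { status: 405 });
    }
    const body = await request.arrayBuffer();
    if (body.byteLength > MAX_EVENT_BYTES) {
      return new Response("event too large", { status: 413 });
    }
    let event: { id?: string; source?: string };
    try {
      event = JSON.parse(new TextDecoder().decode(body));
    } catch {
      return new Response("invalid JSON", { status: 400 });
    }
    if (!event.id || !event.source) {
      return new Response("missing required fields", { status: 422 });
    }
    await env.EVENT_QUEUE.send(event); // hand off to the aggregation pipeline
    return new Response("accepted", { status: 202 });
  },
};
```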
Constraints
- Maximum event size: 64KB
- Processing latency target: <50ms p99
- Retention period: 30 days for raw events, 90 days for aggregates
Technology Stack
- Cloudflare Workers
- Durable Objects
- TypeScript
- Queues
