Case Study
Density/presence aggregation, API, scaling.
150K
Events/minute
180ms
p95 latency
The client needed to visualize foot-traffic patterns across hundreds of locations in real time. Existing solutions couldn't handle the data volume or provide the granularity needed.
Built a streaming pipeline with geo-resolution at multiple zoom levels. Pre-aggregated data into time buckets. Separated hot (recent) and cold (historical) storage. Implemented tiered caching with Redis.
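To make the time-bucket idea concrete, here is a minimal sketch of how events could be collapsed into fixed windows before aggregation. The 60-second bucket width and the key format are assumptions for illustration; the case study doesn't state the actual bucket sizes.

```python
from datetime import datetime, timezone

BUCKET_SECONDS = 60  # assumed bucket width; the real pipeline may use several

def bucket_key(ts: datetime, cell: str) -> str:
    """Map an event to its (H3 cell, bucket start) aggregation key."""
    epoch = int(ts.timestamp())
    start = epoch - (epoch % BUCKET_SECONDS)  # round down to the bucket boundary
    return f"{cell}:{start}"

# All events in the same cell and the same minute share one counter key.
print(bucket_key(datetime(2024, 5, 1, 12, 30, 42, tzinfo=timezone.utc), "8928308280fffff"))
```

Pre-aggregating this way means a query never touches raw events, only per-bucket counters.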
Documents and deliverables from the project
H3 Resolution spec
Geo grid
API spec (OpenAPI)
REST v1
Cache strategy doc
Redis + ClickHouse
6-phase checklist before release
Handles 150K events/minute comfortably. 95th-percentile query latency: 180ms. The API serves 50+ third-party integrations. Reduced infrastructure cost by 60% vs the previous solution.
H3 hexagonal grid for consistent aggregation across zoom levels. Pre-compute at multiple resolutions to avoid runtime calculations.
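A sketch of what the multi-resolution pre-compute can look like, assuming the h3-py library (v4 API); the specific resolutions are illustrative, since the project's actual zoom-to-resolution mapping isn't given here.

```python
import h3  # h3-py v4

# Assumed zoom-to-resolution mapping: coarse (regional) to fine (street level).
RESOLUTIONS = (5, 7, 9)

def index_event(lat: float, lng: float) -> dict[int, str]:
    """Return the H3 cell containing (lat, lng) at each pre-computed resolution."""
    return {res: h3.latlng_to_cell(lat, lng, res) for res in RESOLUTIONS}
```

Indexing once per resolution at write time is what turns a zoom change into a simple key lookup at query time.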
Last 24h in Redis for instant access. 30 days in ClickHouse for fast analytics. Beyond that: compressed in S3, queryable via Athena. Tiering is automatic: a background worker promotes/demotes data based on age, and the API layer routes queries to the correct store transparently — the client sees one unified API regardless of where the data lives.
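A minimal sketch of that routing logic, using the cutoffs above; the store names are placeholders, and a query spanning multiple tiers would additionally need fan-out and merge, which is omitted here.

```python
from datetime import datetime, timedelta, timezone

HOT_WINDOW = timedelta(hours=24)   # Redis
WARM_WINDOW = timedelta(days=30)   # ClickHouse

def pick_store(oldest_requested: datetime) -> str:
    """Route a query to the coldest tier its time range reaches into."""
    age = datetime.now(timezone.utc) - oldest_requested
    if age <= HOT_WINDOW:
        return "redis"
    if age <= WARM_WINDOW:
        return "clickhouse"
    return "athena"  # compressed data in S3, queried via Athena
```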
Heatmap tiles must update within 2 seconds of an event occurring, but naive per-event recalculation would collapse under load. We implemented a two-stage pipeline: (1) incoming events are batched in Redis Streams (50ms window), then (2) a Lua script atomically increments pre-computed H3 cell counters at multiple resolutions simultaneously. The frontend polls tile deltas (not full snapshots) via Server-Sent Events, reducing bandwidth by 94% compared to full-tile polling. At 50K events/second, tile latency stays under 1.8 seconds and CPU usage remains below 40% on a single 4-core instance.
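A sketch of the atomic counter step, stage (2), using redis-py's script support; the "tiles:&lt;res&gt;" key layout and the batch interface are assumptions, not the production code.

```python
import redis

r = redis.Redis()

# KEYS[i] is the counter hash for one resolution; ARGV[i] is the H3 cell at
# that resolution; the final ARGV entry is the batch's event count.
INCR_CELLS = r.register_script("""
local n = tonumber(ARGV[#ARGV])
for i = 1, #KEYS do
    redis.call('HINCRBY', KEYS[i], ARGV[i], n)
end
return n
""")

def apply_batch(cells_by_res: dict[int, str], count: int) -> None:
    """Add `count` events to the matching cell counter at every resolution, atomically."""
    keys = [f"tiles:{res}" for res in cells_by_res]
    INCR_CELLS(keys=keys, args=list(cells_by_res.values()) + [count])
```

Running the loop inside one script means a tile read never observes a batch half-applied across resolutions.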
Have a similar project? Get an estimate or book a call.