Building a cost-effective logging platform using Clickhouse for petabyte scale

Zomato's production systems generate over 50 TB of uncompressed logs per day, peaking at 150 million log events per minute. To handle this volume, the team moved from Elasticsearch to ClickHouse, leveraging its horizontal scalability and low query latency. Custom Golang workers batch log insertions efficiently, AWS spot instances keep compute costs down, and a semi-structured schema simplifies data management. Query throttling mechanisms and advanced monitoring round out the platform, keeping it performant and resilient under load.
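
To make the batching idea concrete, here is a minimal sketch of a Go worker that accumulates log rows and flushes them to ClickHouse in large batches, assuming the open-source clickhouse-go v2 client. The `logs` table, its columns, the server address, and the batch-size/interval values are illustrative placeholders, not details published in the post.

```go
package main

import (
	"context"
	"log"
	"time"

	"github.com/ClickHouse/clickhouse-go/v2"
	"github.com/ClickHouse/clickhouse-go/v2/lib/driver"
)

// LogEntry is a hypothetical, simplified record; the post describes a
// semi-structured schema but does not publish the exact columns.
type LogEntry struct {
	Timestamp time.Time
	Service   string
	Message   string
}

// insertWorker drains the channel and flushes a batch when it reaches
// maxBatch rows or when flushEvery elapses, whichever comes first.
// Large, infrequent inserts are the access pattern ClickHouse favors.
func insertWorker(ctx context.Context, conn driver.Conn, entries <-chan LogEntry) {
	const (
		maxBatch   = 5000            // illustrative; tune to your log volume
		flushEvery = 2 * time.Second // illustrative
	)

	newBatch := func() driver.Batch {
		b, err := conn.PrepareBatch(ctx, "INSERT INTO logs (timestamp, service, message)")
		if err != nil {
			log.Fatalf("prepare batch: %v", err)
		}
		return b
	}

	batch, count := newBatch(), 0
	flush := func() {
		if count == 0 {
			return
		}
		if err := batch.Send(); err != nil {
			log.Printf("batch send: %v", err)
		}
		batch, count = newBatch(), 0
	}

	ticker := time.NewTicker(flushEvery)
	defer ticker.Stop()
	for {
		select {
		case e, ok := <-entries:
			if !ok {
				flush() // drain remaining rows on shutdown
				return
			}
			if err := batch.Append(e.Timestamp, e.Service, e.Message); err != nil {
				log.Printf("append: %v", err)
				continue
			}
			if count++; count >= maxBatch {
				flush()
			}
		case <-ticker.C:
			flush() // time-based flush keeps latency bounded at low volume
		}
	}
}

func main() {
	conn, err := clickhouse.Open(&clickhouse.Options{
		Addr: []string{"localhost:9000"}, // placeholder address
	})
	if err != nil {
		log.Fatal(err)
	}

	entries := make(chan LogEntry, 10_000)
	go func() {
		for i := 0; i < 3; i++ {
			entries <- LogEntry{time.Now(), "demo", "hello"}
		}
		close(entries)
	}()
	insertWorker(context.Background(), conn, entries)
}
```

The dual flush trigger (row count or timer) is the standard compromise for ClickHouse ingestion: the MergeTree engines perform best with few large inserts, while the timer caps how stale buffered rows can get during quiet periods.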

