Apache Kafka Core Internals: A Deep Dive

In the last few years, Apache Kafka has been used extensively in enterprises for real-time data collection, delivery, and processing. This talk provides a deep dive into some of the key internals that help make Kafka popular. Companies like LinkedIn are now sending more than one trillion messages per day to Kafka. Learn about the underlying design in Kafka that enables such high throughput. Many companies (e.g., financial institutions) now store mission-critical data in Kafka. Learn how Kafka supports high availability and durability through its built-in replication mechanism. One common use case for Kafka is propagating updatable database records. Learn how Kafka's log compaction feature is designed to handle this kind of data more naturally.
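
As a small taste of the two features the abstract mentions (replication-backed durability and log compaction), here is a minimal sketch using Kafka's Java clients. It assumes a broker at localhost:9092; the topic name user-profiles, the partition/replica counts, and the record payloads are purely illustrative, not from the talk itself.

```java
import org.apache.kafka.clients.admin.AdminClient;
import org.apache.kafka.clients.admin.AdminClientConfig;
import org.apache.kafka.clients.admin.NewTopic;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.config.TopicConfig;
import org.apache.kafka.common.serialization.StringSerializer;

import java.util.Collections;
import java.util.Map;
import java.util.Properties;

public class CompactedTopicExample {
    public static void main(String[] args) throws Exception {
        String bootstrap = "localhost:9092"; // assumed local broker

        // Create a log-compacted topic with replication factor 3: compaction
        // retains at least the latest record per key, which suits updatable
        // database records, while replication provides availability/durability.
        Properties adminProps = new Properties();
        adminProps.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrap);
        try (AdminClient admin = AdminClient.create(adminProps)) {
            NewTopic topic = new NewTopic("user-profiles", 3, (short) 3)
                    .configs(Map.of(TopicConfig.CLEANUP_POLICY_CONFIG,
                                    TopicConfig.CLEANUP_POLICY_COMPACT));
            admin.createTopics(Collections.singletonList(topic)).all().get();
        }

        // Produce keyed updates with acks=all, so the broker acknowledges
        // only after the full in-sync replica set has the record.
        Properties producerProps = new Properties();
        producerProps.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrap);
        producerProps.put(ProducerConfig.ACKS_CONFIG, "all");
        producerProps.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        producerProps.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        try (KafkaProducer<String, String> producer = new KafkaProducer<>(producerProps)) {
            // Two updates for the same key; after compaction runs, only the
            // latest value for "user-42" is guaranteed to be retained.
            producer.send(new ProducerRecord<>("user-profiles", "user-42", "{\"email\":\"old@example.com\"}"));
            producer.send(new ProducerRecord<>("user-profiles", "user-42", "{\"email\":\"new@example.com\"}"));
            producer.flush();
        }
    }
}
```

Because compaction keeps the last value per key rather than discarding by age, a topic configured this way can serve as a continuously updated changelog of database rows, which is the use case the talk explores.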

Presenter

Jun Rao

Jun Rao is the co-founder of Confluent, a company that provides a stream data platform on top of Apache Kafka. Before Confluent, he was a senior staff engineer at LinkedIn, where he led the development of Kafka. Before LinkedIn, he was a researcher at IBM's Almaden Research Center, where he conducted research on databases and distributed systems. Jun is the PMC chair of Apache Kafka and a committer on Apache Cassandra.