BIG DATA becomes FAST DATA.
More and more companies are adopting distributed streaming platforms that not only enable publish-and-subscribe messaging, but also store and process data streams. Companies across all industries now rely on real-time data to personalize customer experiences or detect fraudulent behavior.
On 30 October, Confluent, provider of a streaming platform based on Apache Kafka®, and mimacom, the data engineering specialist, invite you to Stuttgart for a three-hour workshop demonstrating use cases of streaming technology in large and midsize companies. These include examples from Audi, Airbnb, and more, plus a detailed look at how job-room.ch uses Kafka as a backbone for Spring microservices.
Please RSVP to reserve your seat; spaces are limited!
15:30 - 16:00 Registration and Snacks
16:00 - 17:00 Real-time processing of large amounts of data: What is a streaming platform? What does your streaming journey look like? How are companies using Apache Kafka and Confluent? - Marcus Urbatschek, Confluent
17:00 - 17:45 Kafka as a backbone for Spring Microservices (and Spring Cloud Dataflow) - Michael Wirth, mimacom
17:45 - 19:00 Your Streaming Journey, more use cases and open discussion - all
19:00 - 21:00 Get together, dinner, interactive Q&A
Real-time processing of large amounts of data: a streaming platform as the central nervous system in the enterprise
70190 Stuttgart (Mitte)