View the recordings and slides from Kafka Summit 2020, the premier event for those who want to learn about streaming data.
Confluent is happy to announce that we will be providing new early release chapters of Kafka: The Definitive Guide v2 every month until the completion of the new e-book in Summer 2021.
Learn how to take full advantage of Apache Kafka®, the distributed publish/subscribe messaging system for handling real-time data feeds.
Responsive, relevant, timely, insightful. Agencies are asking a lot of their data these days and treating it as a strategic asset. It’s a big job and a big change for agencies, which have been dealing with disconnected data silos, legacy applications and practices, and under-resourced data operations for decades. Making the shift from data as a passive asset to an active one takes some work, but it pays off. In this report, you’ll learn how to use event streaming to process, store, analyze and act on both historical and real-time data in one place. You'll also explore: the data access and management challenges agencies are facing and how to address them; how the CDC tracked COVID test events to maximize the value of COVID testing; and best practices for data analysis and productivity.
The Fourth Industrial Revolution (also known as Industry 4.0) is the ongoing automation of traditional manufacturing and industrial practices using modern smart technology. Event streaming with Apache Kafka plays a central role in processing massive volumes of data in real time in a reliable, scalable, and flexible way, integrating with various legacy and modern data sources and sinks.
In this talk, we are going to observe the natural journey companies undertake to become real-time, the possibilities it opens for them, and the challenges they will face.
Government agencies understand the need to augment traditional SIEM systems. And, with this knowledge comes the pressure to do so in a way that is better, faster, and cheaper than before.
This 60-minute online talk is packed with practical insights: you will learn how Kafka fits into a data ecosystem that spans a global enterprise and supports use cases for both data ingestion and integration.
If you work in a business that could benefit from automation, IoT, and real-time data, or already does, read on. Streaming data is the heart of Industry 4.0.
To succeed, insurance companies must unify data from all their channels that may be scattered across multiple legacy systems as well as new digital applications. Without the ability to access and combine all this data in real time, delivering a truly modern insurance experience while assessing fast-changing risks can be an uphill battle. Our eBook explains how event streaming, an emerging technology for analyzing event data in real time, can help insurers compete with their insuretech peers. You will learn how combining event streaming from Apache Kafka® and Confluent with Google Cloud can help you.
To succeed, retailers must unify data from all their channels that may be scattered across point-of-sale, eCommerce, ERP, and other systems. Without integrating this data and making it available to applications in real time, it’s almost impossible to deliver a fully connected omnichannel customer experience. Our eBook explains how event streaming, an emerging technology for analyzing event data in real time, can make all the difference. You will learn how combining event streaming from Apache Kafka® and Confluent with Google Cloud can help you.
Banking customers today demand personalized service and expect real-time insight into their accounts from any device—and not just during “business hours.” With a centralized data architecture that enables event streaming, financial services companies can create all kinds of real-time applications that respond intelligently and automatically to data events right as they happen. To learn more about how event streaming enables higher customer engagement, AI automation, real-time analytics, and more, download this free Confluent ebook.
Most insurance companies today are somewhere along the spectrum of digital transformation, finding new ways to use data while staying within the confines of strict regulatory complexity and capital requirements. But only a few insurtech leaders and innovative startups have really tapped into real-time streaming data as the architecture behind these efforts. In this free ebook, learn about three pivotal insurance business uses for event streaming: reducing operating costs with automated digital experiences, personalizing the customer experience, and mitigating risks with real-time fraud and security analytics.
The future of retail hinges on the ability for brands to harness data not just from direct customer interactions but from numerous social media platforms, applications, and external sources. Yet data is also the obstacle holding most retailers back from true transformation. To derive value from customer actions and transactions, brands must be able to capture events in real time and wield that information for specific actions. In this free ebook, learn about three pivotal retail business uses for event streaming: boosting sales with real-time personalization, creating consistent omnichannel experiences for customers, and increasing operational agility with real-time inventory.
In this ebook, you’ll learn about the adoption curve of event streaming and how to gain momentum and effect change within your organization. Learn how to wield event streaming to convert your enterprise to a real-time digital business, responsive to customers and able to create business outcomes in ways never before possible.
In this ebook, we cover five of the more common use cases Confluent has supported, with real-world customer examples and insights into how your organization can make the leap. You’ll get insight into how event streaming can help with use cases such as customer 360° and website clickstream analysis, legacy IT modernization, a single view of the business, next-gen apps, and real-time analytics. These are just a handful of the ways we’ve witnessed forward-thinking companies integrate event streaming into the core of their business models.
In this ebook, you’ll learn about the profound strategic potential in an event streaming platform for enterprise businesses of many kinds. The types of business challenges event streaming is capable of addressing include driving better customer experience, reducing costs, mitigating risk, and providing a single source of truth across the business. It can be a game changer.
We used to talk about the world’s collective data in terms of terabytes. Now, according to IDC’s latest Global Datasphere, we talk in terms of zettabytes: 138ZB of new data will be created in 2024, and 24% of it will be real-time data. How important is real-time streaming data to enterprise organizations? If they want to respond at the speed of business, it’s crucial. In this digital economy, having a competitive advantage requires using data to support quicker decision-making, streamlined operations, and optimized customer experiences. Those things all come from data.
At many financial services firms, data is scattered across multiple silos and may only be accessible through static, batched reports. Without unifying this data and making it available to applications in real time, it’s almost impossible to power AI and deliver an advanced, digital customer experience. Our eBook explains how event streaming, an emerging technology for analyzing event data in real time, can make all the difference. You will learn how combining event streaming from Apache Kafka® and Confluent with Google Cloud can help.
Streaming data is the glue that takes information from siloed systems such as retail banking, loan processing, and credit cards and combines it with mobile banking and website activity—without requiring any modification of the existing systems. For financial services, there’s a very clear advantage to linking together the data already being produced on legacy technologies.
Systems central to insurance such as underwriting, billing, and claims processing have always produced abundant data, but that data has traditionally been siloed and hard to connect. Streaming data platforms provide the missing link to connect rich data to new mobile applications and website activity without requiring insurance organizations to modify their existing systems.
Ventana Research finds that more than nine in ten organizations place a high priority on speeding the flow of information and improving the responsiveness of their organizations. This is where event streaming comes in. Even for retailers still beholden to legacy technologies, there are ways to connect data events in real time across the organization. Download this free report to learn more.
Ventana Research recommends that financial services organizations harness streaming data in real time to transform their operations and improve their results.
IDC's perspective on Confluent Platform 6.0 is here. In it, you can read IDC's take on the importance of event streaming for today's enterprises, along with key recommendations, actions, and highlights of Confluent Platform 6.0.
Data streaming platforms provide the missing link to combine this complex, rich data with new apps and website activity, without requiring changes to existing systems.
Ventana Research has found that nine in ten companies place great value on accelerating the flow of information and shortening the response times of their systems.
Ventana Research recommends that banks and financial services providers use real-time data to transform their operations and increase returns.
Learn how companies will leverage event streaming, Apache Kafka, and Confluent to meet the demands of a real-time market, rising regulations, customer expectations, and much more in 2021.
Core insurance systems such as underwriting, billing, and claims processing have always produced enormous volumes of data, but that data has traditionally been siloed and hard to connect. Real-time data platforms provide the missing link to connect rich data to new mobile applications and website activity without insurers having to modify their existing systems.
Ventana Research finds that more than nine in ten organizations place a high priority on accelerating the flow of information and improving the responsiveness of their organizations.
Hands-on workshop: Using Kubernetes, Spring Boot, Kafka Streams, and Confluent Cloud to rate Christmas movies.
Read IDC's take on Confluent Platform 6.0 here, including IDC's perspective on the importance of event streaming for today's businesses.
This IDC Perspective looks at the highlights of Confluent Platform 6.0 and explains the importance of event streaming for the modern enterprise.
This ebook presents the five most common use cases for event streaming, including real-world customer examples and best practices for transformation within the enterprise.
This ebook presents the typical adoption curve of event streaming in the enterprise and shows, with examples, how change can be implemented in the organization step by step.
In this ebook, you'll learn how much strategic potential an event streaming platform holds for companies of every size.
In this ebook, you'll learn about the adoption curve of event streaming and how best to drive its adoption and implement this change within your organization.
In this ebook, we present five use cases commonly addressed with Confluent, with real-world customer examples and insights into how your organization can achieve this transformation.
In this ebook, you'll discover the immense strategic potential that event streaming offers businesses of all kinds.
Learn how Apache Kafka, Confluent, and event-driven microservices ensure real-time communication and event streaming for modernized deployment, testing, and continuous delivery.
In live demos, we show the added value that audit logs deliver in Confluent Platform 6.0.
Building on the overview webinar, we take a detailed look at further new features of Confluent Platform 6.0.
If you’re a leader in a business that could or does benefit from automation, IoT, and real-time data, don’t miss this white paper. The lifeblood of Industry 4.0 is streaming data, which is where event streaming comes in: the real-time capture, processing, and management of all your data in order to drive transformative technology initiatives.
Building and scaling event-driven applications is a real challenge, because sources of event data often span multiple data centers, clouds, microservices, and highly distributed environments.
In this two-hour spooktacular workshop with Bruce Springstreams, learn about event-driven microservices with Spring BOOOOt and Confluent Cloud.
Building on the overview webinar, we take a detailed look at three new features of Confluent Platform 6.0.
For financial services companies, digital technologies can solve business problems, drastically improve traditional processes, modernize middleware and front-end infrastructure, improve operational efficiency, and most importantly, better serve customers. The advent of real-time stock trading, predictive analytics and risk modeling, and integration of data with artificial intelligence (AI) to prevent fraud are all examples of how fintechs are innovating on a backbone of digital. Being able to “think big” in financial services is critical to staying competitive on the spectrum of digital transformation. But digital transformation has become a catchall phrase, often misused. There’s a difference between a company that uses a lot of different kinds of software in an ad hoc way and one that puts digital transformation at the center of its strategy. In this paper, read about: the digital trajectory of financial services today and the biggest obstacles to digital transformation; the importance of centralized data that can be used for real-time event streaming; and real-life success stories and use cases that illustrate how fintechs are transforming. Learn how real-time data and event streaming can transform your business.
Hear from Intrado’s Thomas Squeo, CTO, and Confluent’s Chief Customer Officer, Roger Scott, to learn how Intrado future-proofed their architecture to support current and future real-time business initiatives.
Technologies open up a range of use cases for Financial Services organisations, many of which will be explored in this talk.
Confluent Cloud enabled the company to get started quickly, minimize operational overhead, and reduce engineering effort.
In this Online Talk Henrik Janzon, Solutions Engineer at Confluent, explains Apache Kafka’s internal design and architecture.
This online talk covers the most important features of the latest release, Confluent Platform 6.0, including many components of Project Metamorphosis.
The IDC Perspective on Confluent Platform 6.0 is here, and in it, you can read IDC’s lens on the importance of event streaming to enterprise companies today.
In this talk, we are going to show some example use cases that Data Reply developed for some of its customers and how Real-Time Decision Engines had an impact on their businesses.
In this webinar, we take a hands-on approach to these questions and walk through connecting a simple application written in .NET to a Kafka cluster in Confluent Cloud. Along the way, we point out best practices for developing and deploying applications that scale easily.
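The webinar's demo is in .NET, but the connection pattern is the same in any Kafka client. A minimal sketch in Java, with a placeholder bootstrap address and API key (substitute your own cluster values):

```java
import java.util.Properties;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

// Hypothetical broker address and credentials: replace with your own cluster values.
public class CloudProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "pkc-XXXXX.us-west-2.aws.confluent.cloud:9092");
        // Confluent Cloud requires TLS plus SASL/PLAIN with an API key and secret.
        props.put("security.protocol", "SASL_SSL");
        props.put("sasl.mechanism", "PLAIN");
        props.put("sasl.jaas.config",
            "org.apache.kafka.common.security.plain.PlainLoginModule required "
            + "username=\"<API_KEY>\" password=\"<API_SECRET>\";");
        props.put("key.serializer", StringSerializer.class.getName());
        props.put("value.serializer", StringSerializer.class.getName());

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            producer.send(new ProducerRecord<>("demo-topic", "hello", "from Confluent Cloud"));
        }
    }
}
```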
Confluent implements layered security controls designed to protect and secure Confluent Cloud customer data, incorporating multiple logical and physical security controls that include access management, least privilege, strong authentication, logging and monitoring, vulnerability management, and bug bounty programs.
Replace the mainframe with new applications using modern and less costly technologies. Stand up to the dinosaur, but keep in mind that legacy migration is a journey. This session will guide you to the next step of your company’s evolution!
This ENTERPRISE MANAGEMENT ASSOCIATES® (EMA™) eBook will show how, with fully managed cloud-based event streaming, executives, managers, and individual contributors gain access to real-time intelligence and the enterprise will achieve unprecedented momentum and material gain. At the core of the event-centric enterprise, event streaming platforms become the nervous system of business ecosystems.
How Michelin future-proofed its IT infrastructure for the years ahead.
Databases represent some of the most successful software that has ever been written and their importance over the last fifty years is hard to overemphasize. Over this time, they have evolved to form a vast landscape of products that cater to different data types, volumes, velocities, and query characteristics. But the broad definition of what a database is has changed relatively little.
Learn how Accor revolutionized its system infrastructure around event streaming with Apache Kafka and Confluent.
Multiply the value of real time for your business with Google Cloud & Confluent.
Event streaming: from technology to a completely new business paradigm.
Learn how Nexthink re-energized employee engagement and experience with event streaming.
The next generation of banking system management, seen through Apache Kafka & Confluent: discover the experience of BNP Paribas.
Event Streaming Paradigm: rethink data not as stored records or transient messages, but as a continually updating stream of events.
Why the use of Apache Kafka and event streaming represents a revolution for businesses.
You know the fundamentals of Apache Kafka. You are a Spring Boot developer working with Apache Kafka, and you have chosen Spring Kafka to integrate with it. You implemented your first producer and consumer, and maybe some Kafka Streams, and it's working... Hurray! You are ready to deploy to production. What can possibly go wrong?
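As a minimal sketch of the kind of Spring Kafka consumer the talk starts from (topic name and group id are illustrative):

```java
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.stereotype.Component;

// A minimal Spring Kafka consumer; "orders" and "order-service" are hypothetical names.
@Component
public class OrderListener {

    // Spring Kafka creates the listener container and deserializes records for us.
    // What can go wrong in production (rebalances, poison pills, retries) is
    // exactly what the talk covers.
    @KafkaListener(topics = "orders", groupId = "order-service")
    public void onOrder(String order) {
        System.out.println("Received order event: " + order);
    }
}
```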
Many companies collect and store their data in various data centers and use multiple business applications and services to access, analyze, and act on it. Managing this mountain of data from disparate sources is very difficult, and all too often the methods used are inefficient and deliver poor results.
We face the challenge of making decisions using data distributed across disparate, heterogeneous environments, which makes taking the right actions complicated and complex. Extracting data from different environments (cloud, VPC, and on-premises) is hard to manage, inefficient, and ineffective when you need to produce reliable results that deliver value to the organization.
Learn how NAV (Norwegian Work and Welfare Department) are using Apache Kafka to distribute and act upon events. NAV currently distributes more than one-third of the national budget to citizens in Norway or abroad. They are there to assist people through all phases of life within the domains of work, family, health, retirement, and social security. Events happening throughout a person’s life determines which services NAV provides to them, how they provide them, and when they offer them.
This four-part webinar series gives an overview of what Kafka is, how it is used, and the core concepts that enable it to power a highly scalable, available, and resilient real-time event streaming platform.
The fast-growing world of stream processing can be daunting, with new concepts to master such as different kinds of time semantics, aggregations, change logs, and frameworks. KSQL is an open source, Apache 2.0 licensed streaming SQL engine based on Apache Kafka that simplifies all of this and makes stream processing available to everyone, without any source code having to be written.
Another new security-related feature in Confluent Platform 5.4 is structured audit logs. Granted, everything in Kafka is a log, but Kafka does not log what Kafka does with Kafka, only what is written to its topics.
In the world of online streaming providers, real-time events are becoming the new standard, driving innovation and a new set of use cases to react to a quickly changing market. We explain how, from simple media player heartbeats, Data Reply fueled a diverse set of near-real-time use cases and services for its customer, from blocking concurrent media streams to recognizing ended sessions and trending content.
Large enterprises, government agencies, and many other organisations rely on mainframe computers to deliver the core systems managing some of their most valuable and sensitive data. However, the processes and cultures around a mainframe often prevent the adoption of the agile, born-on-the web practices that have become essential to developing cutting edge internal and customer-facing applications.
A company's journey to the cloud often starts with the discovery of a new use case or need for a new application. Deploying Confluent Cloud, a fully managed cloud-native streaming service based on Apache Kafka, enables organisations to revolutionise the way they build streaming applications and real-time data pipelines.
In our digital age of big data and IoT, in which several quintillion bytes of data are produced every day, it is critically important for companies to have the right data ready at the right time, regardless of the application and regardless of whether it lives in the cloud or on-premises.
Today, business data very often resides in different applications, each with its own, overly heterogeneous way of representing and managing the same data. Wherever this scattered data needs to be consumed centrally, extracting it from such disparate sources becomes hard to manage and extremely inefficient.
Do you run a Kafka cluster? Have you mastered the basics but want to take your data streaming further? In this webinar for advanced Kafka users, we discuss the processing capabilities offered by Kafka Streams and ksqlDB through examples and use cases, along with best practices to apply when taking on a project or initiative with these technologies.
Harness the potential of your data by pairing advanced API management solutions with a Kafka event architecture. Opening up access to the right data in real time, from anywhere and at the exact moment it is needed, has become a major challenge for CIOs as they digitize their interconnections.
Maximizing the potential of data by coupling API management solutions with a Kafka event architecture, and enabling access to the right data in real time, from anywhere and at the exact moment it is needed, have become major challenges for CIOs in their digital transformation.
Apache Kafka® is a streaming platform that unites critical business events from every part of a company into a kind of central nervous system, bringing together all relevant activity as streams of events.
In the second part of the deep dive sessions, we discuss building a logical cluster across multiple regions that covers all of your HA SLAs (Bronze, Silver, Gold). We also cover the great potential of moving all of your organization's use cases onto a multi-tenant cluster.
Learn how Apache Kafka and Confluent help the gaming industry leverage real-time integration, event streaming, and data analytics for seamless gaming experiences at scale.
Apache Kafka is an open source event streaming platform. It is often used to complement or even replace existing middleware to integrate applications and build microservice architectures, and it is already used in various projects in almost every bigger company today: understood, battle-tested, highly scalable, reliable, real-time. Blockchain is a different story. This technology is much in the news, especially in connection with cryptocurrencies like Bitcoin. But what is the added value for software architectures? Is blockchain just hype that adds complexity? Or will it be used by everybody in the future, like a web browser or mobile app today? And how does it relate to an integration architecture and event streaming platform? This session explores use cases for blockchains and discusses different alternatives such as Hyperledger, Ethereum and a Kafka-native tamper-proof blockchain implementation. Different architectures are discussed to understand when blockchain really adds value and how it can be combined with the Apache Kafka ecosystem to integrate blockchain with the rest of the enterprise architecture and build a highly scalable and reliable event streaming infrastructure. Speakers: Kai Waehner, Technology Evangelist, Confluent; Stephen Reed, CTO, Co-Founder, AiB
ksqlDB makes it possible to build event streaming applications just as easily, and with familiar tools, as conventional applications on a relational database. Combined with Confluent Cloud, this opens up many exciting new possibilities. Using a real-world use case, we build an environment live in the webinar that stands up a real-time event streaming cluster. Events are processed and analyzed in real time using a SQL-like language: ksqlDB. This use case shows how easy it is to provision real, performant environments in Confluent Cloud in a short time and start analyses immediately. Speaker: Carsten Mützlitz, Solution Engineer, Confluent
In this presentation, Lyndon Hedderly, Team Lead of Business Value Consulting at Confluent, will cover how Confluent works with customers to measure the business value of data streaming.
Developing a streaming solution against a self-managed Kafka cluster can be awkward and time consuming, largely due to security requirements and configuration red tape. It's beneficial to use Confluent Cloud in the early stages to make quick progress. Creating the cluster in Confluent Cloud is super easy and allows you to concentrate on defining your Connect sources and sinks as well as fleshing out the streaming topology on your laptop. It also shows the client how easy it is to swap out the self-managed Kafka cluster for Confluent Cloud.
Without any coding or scripting, end users leverage their existing spreadsheet skills to build customized streaming apps for analysis, dashboarding, condition monitoring or any kind of real-time pre- and post-processing of Kafka or ksqlDB streams and tables.
Join Kai Waehner, Technology Evangelist at Confluent, for this session which explores various telecommunications use cases, including data integration, infrastructure monitoring, data distribution, data processing and business applications. Different architectures and components from the Kafka ecosystem are also discussed.
Adapting to the real-time demands of mission-critical applications is only possible with an architecture that scales elastically. Confluent has extended Apache Kafka into an elastically scalable event streaming platform that processes real-time data wherever it lives, making event streaming accessible for any budget and any use case.
In this webinar we want to share our experience on how the Swiss Mobiliar, the biggest Swiss household insurance enterprise, introduced Kafka and led it to enterprise-wide adoption with the help of AGOORA.com.
Serving machine learning models for real-time prediction presents challenges in both data engineering and data science. How do you build a modern pipeline that delivers continuous predictions? In a supervised setting, how do you combine tracing with performance tracking? How do you capture feedback to trigger reactive retraining? In this talk, we put together a concrete pipeline proposal that accounts for the exploration and monitoring phases in a real-time context. The ingredients: an event log, a notebook platform, and other surprises straight from the cloud.
With Confluent Platform 5.5, we make it even easier for developers to get started with Kafka and begin building event streaming applications, regardless of their preferred programming language or the underlying data formats used in their applications.
Retailers can now anticipate customer purchase intent, reorder inventory instantly in response to a sale, and onboard new stores in a fraction of the time. And that's just the tip of the iceberg... in this talk, Carsten presents ideas around the Apache Kafka streaming platform in retail and demonstrates several of them live.
Join this online talk to understand how and why Apache Kafka has become the de facto standard for reliable and scalable streaming infrastructures in the finance industry.
This document provides an overview of Confluent and Snowflake’s integration, a detailed tutorial for getting started with the integration, and unique considerations to keep in mind when working with these two technologies.
TCO is the total cost of ownership: purchase price plus the cost to operate. A comprehensive TCO assessment should factor in time, manpower, and other costs across an entire organization over time. Open source vs. SaaS is a common topic, with open source software claiming to be "free" and vendors making claims about their software reducing TCO or improving ROI. But how do you measure these claims? What are the assumptions? How robust are the estimates? This white paper answers these questions for Confluent Cloud by outlining the model we use and our approach, along with a customer example, sharing lessons learned along the way.
This COMPUTERWOCHE white paper introduces streaming platforms, shows their business value, and presents a range of applications.
Adjusting to the real-time needs of your mission-critical apps is only possible with an architecture that scales elastically. Confluent re-engineered Apache Kafka into an elastically scalable, next-gen event streaming platform that processes real-time data wherever it lives - making it accessible for any budget or use case.
Join Unity, Confluent and GCP to learn how to reduce risk and increase business options with a hybrid cloud strategy.
Mainframe offloading with Apache Kafka and its ecosystem can be used to keep a more modern data store in real-time sync with the mainframe. At the same time, it persists the event data on the bus to enable microservices and delivers the data to other systems such as data warehouses and search indexes.
This white paper reports the results of benchmarks we ran on a 2-CKU multi-zone dedicated cluster and shows the ability of a CKU to deliver the stated client bandwidth on AWS, GCP, and Azure clouds.
Explore the use cases and architecture for Apache Kafka®, and how it integrates with MongoDB to build sophisticated data-driven applications that exploit new sources of data.
Experts from Confluent and Attunity share how you can: realize the value of streaming data ingest with Apache Kafka®, turn databases into live feeds for streaming ingest and processing, accelerate data delivery to enable real-time analytics and reduce skill and training requirements for data ingest.
Get answers to: How would you use Apache Kafka® in a microservice application? How do you build services over a distributed log and leverage the fault tolerance and scalability that comes with it?
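One way to see "services over a distributed log" concretely: a service can rebuild its local state by replaying a topic from the beginning. A hypothetical Java sketch (topic name and aggregation are illustrative):

```java
import java.time.Duration;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

// Rebuild a service's local view by replaying the "payments" log from offset zero.
public class BalanceView {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("group.id", "balance-view");
        props.put("auto.offset.reset", "earliest"); // replay history on first start
        props.put("key.deserializer", StringDeserializer.class.getName());
        props.put("value.deserializer", StringDeserializer.class.getName());

        Map<String, Long> balances = new HashMap<>(); // account -> running balance
        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(List.of("payments"));
            while (true) {
                for (ConsumerRecord<String, String> rec : consumer.poll(Duration.ofMillis(500))) {
                    balances.merge(rec.key(), Long.parseLong(rec.value()), Long::sum);
                }
            }
        }
    }
}
```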
Get an introduction to Apache Kafka® and how it serves as a foundation for streaming data pipelines and applications that consume/process real-time data streams. Part 1 in the Apache Kafka: Online Talk Series.
In this talk by Jun Rao, co-creator of Apache Kafka®, get a deep dive on some of the key internals that makes Apache Kafka popular, including how it delivers reliability and compaction. Part 2 in the Apache Kafka: Online Talk Series.
Learn different options for integrating systems and applications with Apache Kafka® and best practices for building large-scale data pipelines using Apache Kafka. Part 3 in the Apache Kafka: Online Talk Series.
Learn typical use cases for Apache Kafka®, how you can get real-time data streaming from Oracle databases to move transactional data to Kafka and enable continuous movement of your data to provide access to real-time analytics.
Learn how to map practical data problems to stream processing and write applications that process streams of data at scale using Kafka Streams. Part 4 in the Apache Kafka: Online Talk Series.
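For a taste of what such an application looks like, here is a minimal hypothetical Kafka Streams topology in Java that filters and re-keys a stream (topic names are illustrative):

```java
import java.util.Properties;

import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.KeyValue;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;

public class PageViewFilter {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "pageview-filter");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        StreamsBuilder builder = new StreamsBuilder();
        KStream<String, String> views = builder.stream("pageviews");
        // Keep only error pages and re-key by page so a downstream
        // consumer can aggregate per page rather than per user.
        views.filter((user, page) -> page.startsWith("/error"))
             .map((user, page) -> KeyValue.pair(page, user))
             .to("error-pageviews");

        new KafkaStreams(builder.build(), props).start();
    }
}
```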
In this talk, we survey the stream processing landscape, the dimensions along which to evaluate stream processing technologies, and how they integrate with Apache Kafka®. Part 5 in the Apache Kafka: Online Talk Series.
This talk focuses on how to integrate all the components of the Apache Kafka® ecosystem into an enterprise environment and what you need to consider as you move into production. Part 6 of the Apache Kafka: Online Talk Series.
This talk will examine the underlying dichotomy we all face as we piece such systems together, one that is not well served today. The solution lies in blending the old with the new, and Apache Kafka® plays a central role. Part 1 in the Apache Kafka for Microservices: A Confluent Online Talk Series.
This practical talk will dig into how we piece services together in event-driven systems, how we use a distributed log to create a central, persistent narrative, and what benefits we reap from doing so. Part 2 in the Apache Kafka® for Microservices: A Confluent Online Talk Series.
This talk will look at how stateful stream processing is used to build truly autonomous services, with the distributed guarantees of exactly-once processing in event-driven services supported by Apache Kafka®. Part 3 in the Apache Kafka for Microservices: A Confluent Online Talk Series.
Join us as we walk through an overview of this exciting new service from the experts in Kafka. Learn how to build robust, portable and lock-in free streaming applications using Confluent Cloud.
Neha Narkhede talks about the experience at LinkedIn moving from batch-oriented ETL to real-time streams using Apache Kafka and how the design and implementation of Kafka was driven by this goal of acting as a real-time platform for event data.
Learn about the recent additions to Apache Kafka® to achieve exactly-once semantics (EoS) including support for idempotence and transactions in the Kafka clients.
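The client-side API for these semantics is small. A sketch of a transactional producer in Java, assuming illustrative topic names and a hypothetical transactional.id:

```java
import java.util.Properties;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class TransactionalProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("enable.idempotence", "true");        // no duplicates on retry
        props.put("transactional.id", "payments-tx-1"); // hypothetical id; enables transactions
        props.put("key.serializer", StringSerializer.class.getName());
        props.put("value.serializer", StringSerializer.class.getName());

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            producer.initTransactions();
            producer.beginTransaction();
            try {
                // Both writes commit atomically, or neither becomes visible
                // to consumers reading with isolation.level=read_committed.
                producer.send(new ProducerRecord<>("debits", "acct-1", "100"));
                producer.send(new ProducerRecord<>("credits", "acct-2", "100"));
                producer.commitTransaction();
            } catch (Exception e) {
                // On fatal errors (e.g. ProducerFencedException) close the
                // producer instead of aborting; abort handles retriable cases.
                producer.abortTransaction();
            }
        }
    }
}
```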
Microservices guru Sam Newman, Buoyant CTO Oliver Gould and Apache Kafka® engineer Ben Stopford are joined by Jay Kreps, co-founder and CEO, Confluent for a Q&A session where they discuss and debate all things Microservices.
Join the discussion on the relationship between microservices and stream processing with Designing Data-Intensive Applications author Martin Kleppmann, Confluent engineers Damian Guy and Ben Stopford, chaired by Jay Kreps, co-founder and CEO, Confluent.
Learn about the KSQL architecture and how to design and deploy interactive, continuous queries for streaming ETL and real-time analytics.
In this talk, Gwen Shapira describes the reference architecture of Confluent Enterprise, which is the most complete platform to build enterprise-scale streaming pipelines using Apache Kafka®. Part 1 in the Best Practices for Apache Kafka in Production Series.
In this session, we go over everything that happens to a message – from producer to consumer, and pinpoint all the places where data can be lost. Build a bulletproof data pipeline with Apache Kafka. Part 2 in the Best Practices for Apache Kafka in Production Series.
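By way of illustration, two of the usual producer-side safeguards against loss, in a minimal Java sketch (settings shown are the commonly recommended ones; the topic name is illustrative):

```java
import java.util.Properties;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class DurableProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("acks", "all");                // wait for all in-sync replicas
        props.put("enable.idempotence", "true"); // safe retries, no duplicates
        props.put("key.serializer", StringSerializer.class.getName());
        props.put("value.serializer", StringSerializer.class.getName());

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // Use the callback form so a failed send is detected
            // instead of silently dropped.
            producer.send(new ProducerRecord<>("events", "k", "v"),
                (metadata, exception) -> {
                    if (exception != null) {
                        exception.printStackTrace(); // alert or retry in real code
                    }
                });
            producer.flush();
        }
    }
}
```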
In this session, we discuss the basic patterns of multi-datacenter Apache Kafka® architectures, explore some of the use cases enabled by each architecture and show how Confluent Enterprise products make these patterns easy to implement. Part 3 in the Best Practices for Apache Kafka in Production Series.
In this session, we discuss disaster scenarios that can take down entire Apache Kafka® clusters and share advice on how to plan, prepare and handle these events. Part 4 in the Best Practices for Apache Kafka in Production Series.
In this presentation, we discuss best practices of monitoring Apache Kafka®. Part 5 of the Best Practices for Apache Kafka in Production series.
Tim Berglund covers the patterns and techniques of using KSQL. Part 1 of the Empowering Streams through KSQL series.
Join us as we build a complete streaming application with KSQL. There will be plenty of hands-on action, plus a description of our thought process and design choices along the way. Part 2 in the Empowering Streams through KSQL series.
In this session, Nick Dearden covers the planning and operation of your KSQL deployment, including under-the-hood architectural details. Part 3 in the Empowering Streams through KSQL series.
In this talk, members of the Pinterest team offer lessons learned from their Confluent Go client migration and discuss their use cases for adopting Kafka Streams.
Apache Kafka® lets companies make optimal use of real-time data. In this webinar, you'll learn more about data streaming and how it can reduce development costs.
In this joint online talk from Confluent and Attunity, we show how data delivery can be accelerated to enable real-time analytics with live data from many sources.
In this webinar presented by Confluent and Attunity, discover how data streaming can be accelerated to enable real-time analytics with live data from many sources.
In this interactive discussion, the KSQL team will answer 10 of the toughest, most frequently asked questions about KSQL.
This online talk gives a brief introduction to Apache Kafka and its use as a data streaming platform.
Join The New York Times' Director of Engineering Boerge Svingen to learn how the innovative news giant of America transformed the way it sources content while still maintaining searchability, accuracy and accessibility—all through the power of a real-time streaming platform.
Join our two-part online talk series, Empowering Streams through KSQL, to dive into the heart of this tool. Our experts explain the architecture of the KSQL engine and show how to design and deploy interactive, continuous queries.
Gwen Shapira presents core patterns of modern data engineering and explains how you can use microservices, event streams and a streaming platform like Apache Kafka to build scalable and reliable data pipelines. Part 1 of 3 in Streaming ETL - The New Data Integration series.
In this online talk, Joe Beda, CTO of Heptio and co-creator of Kubernetes, and Gwen Shapira, principal data architect at Confluent and Kafka PMC member, will help you navigate through the hype, address frequently asked questions and deliver critical information to help you decide if running Kafka on Kubernetes is the right approach for your organization.
Capital One supports interactions with real-time streaming transactional data using Apache Kafka®. Join us for this online talk on lessons learned, best practices and technical patterns of Capital One’s deployment of Apache Kafka.
Join experts from VoltDB and Confluent to see why and how enterprises are using Apache Kafka as the central nervous system in combination with VoltDB.
In this talk, we'll build a streaming data pipeline using nothing but our bare hands, the Kafka Connect API and KSQL.
We’ll discuss how to leverage some of the more advanced transformation capabilities available in both KSQL and Kafka Connect. Part 3 of 3 in Streaming ETL - The New Data Integration online talk series.
‘The Current State of Stream Processing’ walks through the origins of stream processing and applicable use cases, then dives into the challenges currently facing the world of stream processing as it drives the next data revolution.
In this talk we will look at what event-driven systems are, how they provide a unique contract for services to communicate and share data, and how stream processing tools can be used to simplify the interaction between different services.
Watch Lyndon Hedderly's keynote from Big Data Analytics London 2018.
In this online talk, Technology Evangelist Kai Waehner will discuss and demo how you can leverage technologies such as TensorFlow with your Kafka deployments to build a scalable, mission-critical machine learning infrastructure for ingesting, preprocessing, training, deploying and monitoring analytic models.
See how Kinetica enables businesses to leverage the streaming data delivered with Confluent Platform to gain actionable insights.
Confluent Co-founder Jun Rao discusses how Apache Kafka® became the predominant publish/subscribe messaging system that it is today, Kafka's most recent additions to its enterprise-level set of features and how to evolve your Kafka implementation into a complete real-time streaming data platform.
There’s a prevailing enterprise perception that compliance with data protection regulations and standards is a burden that limits how data can be leveraged.
Modern streaming data technologies like Apache Kafka® and Confluent KSQL, the streaming SQL engine for Apache Kafka, can help companies catch and detect fraud in real time instead of after the fact.
With the evolution of data-driven strategies, event-based business models are influential in innovative organizations.
What was once a ‘batch’ mindset is quickly being replaced with stream processing as the demands of the business impose real-time requirements on technology leaders.
Learn from field experts as they discuss how to convert the data locked in traditional databases into event streams using HVR and Apache Kafka®.
Rabobank rose to this challenge and defined the Business Event Bus (BEB) as the place where business events from across the organization are shared between applications.
In this online talk, you’ll hear about ingesting your Kafka streams into Imply’s scalable analytic engine and gaining real-time insights via a modern user interface.
In this session, we will share how companies around the world are using Confluent Cloud, a fully managed Apache Kafka® service, to migrate to AWS.
Learn how Generali Switzerland set up an event-driven architecture to support their digital transformation project.
This talk will cover how to integrate real-time analytics and visualizations to drive business processes and how KSQL, streaming SQL for Kafka, can easily transform and filter streams of data in real time.
Detecting fraudulent activity in real time can save a business significant amounts of money, but has traditionally been an area requiring a lot of complex programming and frameworks, particularly at scale.
This online talk includes in depth practical demonstrations of how Confluent and Panopticon together support several key financial services and IoT applications, including transaction cost analysis and risk monitoring.
In this session, we will share how companies around the world are using Confluent Cloud, a fully managed Apache Kafka® service, to migrate to GCP.
This online talk will showcase how Apache Kafka® plays a key role within Express Scripts’ transformation from mainframe to a microservices-based ecosystem, ensuring data integrity between two worlds.
This talk looks at one of the most common integration requirements – connecting databases to Apache Kafka.
In this all too fabulous talk, we will be addressing the wonderful and new wonders of KSQL vs. KStreams and how Ticketmaster uses KSQL and KStreams in production to reduce development friction in machine learning products.
In this online talk, you will learn why, when facing Open Banking regulation and rapidly increasing transaction volumes, Nationwide decided to take load off their back-end systems through real-time streaming of data changes into Apache Kafka®.
In this session, we'll compare the two approaches to data integration and show how Dataflow allows you to join, transform, and deliver data streams among on-prem and cloud Apache Kafka clusters, Cloud Pub/Sub topics and a variety of databases.
Confluent KSQL is the streaming SQL engine that enables real-time data processing against Apache Kafka®. It provides an easy-to-use, yet powerful interactive SQL interface for stream processing on Kafka.
In this webinar we show how Audi used Kafka and Confluent to build a fast-data IoT platform that is revolutionizing the connected car space.
In this online talk, our Kafka expert shows how easy it is to run Apache Kafka and Confluent Platform on Kubernetes.
In this session, we will cover the easiest ways to start developing event-driven applications with Apache Kafka using Confluent Platform.
This talk explores the benefits around cloud-native platforms and running Apache Kafka on Kubernetes, what kinds of workloads are best suited for this combination, and best practices.
Real-time data has value. But how do you quantify that value? This talk explores why valuing Kafka is important, and covers some of the problems in quantifying the value of a data infrastructure platform.
This online talk explores how Apache Druid and Apache Kafka® can turn a microservices ecosystem into a distributed real-time application with instant analytics.
This online talk is based on real-world experience of Kafka deployments and explores a collection of common mistakes that are made when running Kafka in production and some best practices to avoid them.
This interactive whiteboard presentation discusses use cases leveraging the Apache Kafka® open source ecosystem as an event streaming platform to process IoT data.
This talk discusses the key design concepts within Apache Kafka Connect and the pros and cons of standalone vs distributed deployment modes.
This talk provides a deep dive into the details of the rebalance protocol, starting from its original design in version 0.9 up to the latest improvements and future work.
Confluent Platform 5.3 is GA, and we're excited about many new features. Among other things, we talk about Confluent Operator (Kafka on Kubernetes), Ansible playbooks and the new UI of Confluent Control Center.
This session shows how various sub-systems in Apache Kafka can be used to aggregate, integrate and attribute these signals into signatures of interest.
This online talk focuses on the key business drivers behind connecting to Kafka and introduces the new Confluent Verified Integrations Program. Part 1 of 2 in Building Kafka Connectors - The Why and How
This online talk dives into the new Verified Integrations Program and the integration requirements, the Connect API and sources and sinks that use Kafka Connect. Part 2 of 2 in Building Kafka Connectors - The Why and How
This talk showcases different use cases in automation and Industrial IoT (IIoT) where an event streaming platform adds business value.
Learn how Centene improved their ability to interact and engage with healthcare providers in real time with MongoDB and Confluent Platform.
In this Spanish-language webinar, we describe why Apache Kafka exists and what it is for. We look at some of its most common uses and examine the platform's three basic components.
This talk explains how companies are using event-driven architecture to transform their business and how Apache Kafka serves as the foundation for streaming data applications. Part 1 of 4 in our Fundamentals for Apache Kafka series
This session explains Apache Kafka’s internal design and architecture. Companies like LinkedIn are now sending more than 1 trillion messages per day to Apache Kafka. Part 2 of 4 in our Fundamentals for Apache Kafka series.
In this webinar, we give a live introduction to the new security components of Confluent Platform (role-based access control, RBAC, and Secret Protection) and discuss best practices in this area.
Pick up best practices for developing applications that use Apache Kafka, beginning with a high level code overview for a basic producer and consumer.
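A minimal sketch of the consumer side of that pattern in Java, with manual commits for at-least-once processing (topic and group id are illustrative):

```java
import java.time.Duration;
import java.util.List;
import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

public class BasicConsumer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("group.id", "basic-app");
        props.put("enable.auto.commit", "false"); // commit only after processing
        props.put("key.deserializer", StringDeserializer.class.getName());
        props.put("value.deserializer", StringDeserializer.class.getName());

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(List.of("events"));
            while (true) {
                for (ConsumerRecord<String, String> rec : consumer.poll(Duration.ofSeconds(1))) {
                    System.out.printf("%s => %s%n", rec.key(), rec.value());
                }
                consumer.commitSync(); // at-least-once: process first, then commit
            }
        }
    }
}
```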
This session will show you how to get streams of data into and out of Kafka with Kafka Connect and REST Proxy, maintain data formats and ensure compatibility with Schema Registry and Avro, and build real-time stream processing applications with Confluent KSQL and Kafka Streams.
In this technical deep dive, we’ll discuss the proposition of Incremental Cooperative Rebalancing as a way to alleviate stop-the-world and optimize rebalancing in Kafka APIs.
In this session, we will identify and demo some best practices for implementing a large scale IoT system that can stream MQTT messages to Apache Kafka.
Learn how AO.com are enabling real-time event-driven applications to improve customer experience using Confluent Platform.
This talk takes an in-depth look at how Apache Kafka® can be used to provide a common platform on which to build data infrastructure driving both real-time analytics as well as event-driven applications.
In this webinar we take a deeper look at KSQL with live-coding examples and see in practice how simple it is to build streaming applications and release them to production.
Join the Confluent Product team as we provide a technical overview of Confluent Platform 5.4, which delivers groundbreaking enhancements in the areas of security, disaster recovery and scalability.
In this webinar, discover the new features of Confluent Platform 5.4, along with its applications and real-world use cases.
In this online talk, Bosch’s Ralph Debusmann outlines their architectural vision for bringing many data streams into a single platform, surrounded by databases that can power complex real-time analytics.
During this online talk, presenters from Confluent and Qlik will demonstrate how to accelerate data delivery to enable real-time analytics, make data more valuable with real-time data ingestion to Kafka, modernize data centers by streaming data in real-time, and demo a customer use case for advanced analytics.
In this online talk, we introduce Apache Kafka® and the MongoDB connector for Kafka, and demonstrate a real world stock trading use case that joins heterogeneous data sources to find the moving average of securities using Apache Kafka and MongoDB.
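A windowed moving average is a natural fit for Kafka Streams. A hypothetical Java sketch, assuming a "stock-trades" topic keyed by ticker symbol with the price as a string value (the talk's own demo uses the MongoDB connector; this only illustrates the averaging step):

```java
import java.time.Duration;
import java.util.Properties;

import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.KeyValue;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.Consumed;
import org.apache.kafka.streams.kstream.Materialized;
import org.apache.kafka.streams.kstream.Produced;
import org.apache.kafka.streams.kstream.TimeWindows;

public class MovingAverage {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "moving-average");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");

        StreamsBuilder builder = new StreamsBuilder();
        builder.stream("stock-trades", Consumed.with(Serdes.String(), Serdes.String()))
            .groupByKey()
            .windowedBy(TimeWindows.ofSizeWithNoGrace(Duration.ofMinutes(5)))
            // Track "sum,count" as a string so the default serdes suffice.
            .aggregate(
                () -> "0.0,0",
                (symbol, price, agg) -> {
                    String[] p = agg.split(",");
                    double sum = Double.parseDouble(p[0]) + Double.parseDouble(price);
                    long count = Long.parseLong(p[1]) + 1;
                    return sum + "," + count;
                },
                Materialized.with(Serdes.String(), Serdes.String()))
            .toStream()
            // Re-key from the windowed key back to the plain ticker symbol.
            .map((windowedKey, agg) -> {
                String[] p = agg.split(",");
                double avg = Double.parseDouble(p[0]) / Long.parseLong(p[1]);
                return KeyValue.pair(windowedKey.key(), Double.toString(avg));
            })
            .to("stock-moving-averages", Produced.with(Serdes.String(), Serdes.String()));

        new KafkaStreams(builder.build(), props).start();
    }
}
```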
In this online talk, Confluent Technology Evangelist Kai Waehner presents the most important changes in the areas of security, disaster recovery and scalability.
Operating a complex distributed system such as Apache Kafka can be a lot of work. In this talk we review common issues and mitigation strategies, seen from the trenches while helping teams around the globe with their Kafka infrastructure.
We explain how the microservice ecosystem around Apache Kafka was built to ensure the ability to build and deploy new streaming agents on AWS fast and with the least amount of operational effort possible, as well as some of the issues we found and worked around.
Deploying and operating a distributed, scalable data system can be challenging. In the first part of the deep dive sessions, we take a detailed look at centralized ACLs and role-based access control, including a live demo in Confluent Control Center (C3).
Industry 4.0 and smart manufacturing are driving the manufacturing industry to modernize their software infrastructure. This session will look at the unique business drivers for modernizing the manufacturing industry and how MQTT and Kafka can help make it a reality.
In this online talk, we’ll explore how and why companies are leveraging Confluent and MongoDB to modernize their architecture and leverage the scalability of the cloud and the velocity of streaming.
This reference architecture documents the MongoDB and Confluent integration including detailed tutorials for getting started with the integration, guidelines for deployment, and unique considerations to keep in mind when working with these two technologies.
In this short webinar, Thomas Trepper, Training Delivery Manager EMEA, presents the training options for Kafka developers and operators as well as the paths to certification. All online trainings and details can be found here: https://www.confluent.io/training/
Robin discusses the role of Apache Kafka as the de facto standard streaming data processing platform.
This session covers architecture best practices and recommendations for organisations aiming for a more cloud-centric approach in the use of Apache Kafka.
Spending time with many OEMs and suppliers as well as technology vendors in the IoT segment, Kai Waehner gives an overview of current challenges in the automotive industry and of a variety of use cases for event-driven architectures.
The Confluent event-streaming platform enables government organizations to unlock and repurpose their existing data for countless modern applications and use cases.
An event streaming architecture is essential to enable agile processes and real-time decision-making.
Learn how CDC (Change Data Capture) captures database transactions for ingest into Confluent Platform to enable real-time data pipelines.
This brief describes a solution for real-time data streaming with ScyllaDB's NoSQL database paired with Confluent Platform.
This brief describes a solution with Neo4j's graph database and Confluent Platform.
This brief describes a modern data architecture with Kafka and MongoDB.
This brief describes streaming data analysis and visualization accelerated by Kinetica's GPU in-memory technology, in partnership with Confluent.
This brief describes an end-to-end streaming analytics solution with Imply, Druid providing the data querying and visualizations and Kafka data streaming.
This brief describes a solution for data integration and replication in real time and continuously into Kafka, in partnership with HVR and Confluent.
This brief describes a modern datacenter to manage the velocity and variety of data with an event-driven enterprise architecture, in partnership with DataStax and Confluent.
This brief describes how to enable operational data flows with NoSQL and Kafka, in partnership with Couchbase and Confluent.
This brief describes a solution to efficiently prepare data streams for Kafka and Confluent with Qlik Data Integration for CDC Streaming.
This brief describes a comprehensive streaming analytics platform for visualizing real-time data with Altair Panopticon and Confluent Platform.
Dive into full Kafka examples, with connector configurations and Kafka Streams code, that demonstrate different data formats and SerDes combinations for building event streaming pipelines.
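As a flavor of what different SerDes combinations mean in practice, a hypothetical Java fragment: the input topic carries Long values, the output topic Strings, and each side declares its own SerDes (topic names and the unit conversion are illustrative):

```java
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.kstream.Consumed;
import org.apache.kafka.streams.kstream.Produced;

public class SerdeExample {
    public static void main(String[] args) {
        StreamsBuilder builder = new StreamsBuilder();
        // Read String keys with Long values, convert, and write String/String:
        // each topic can carry a different format, and the SerDes declare it.
        builder.stream("sensor-readings", Consumed.with(Serdes.String(), Serdes.Long()))
               .mapValues(v -> String.format("%.1f C", v / 10.0))
               .to("sensor-readings-formatted", Produced.with(Serdes.String(), Serdes.String()));
        // builder.build() would then be passed to new KafkaStreams(...) as usual.
    }
}
```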
Get the presentations from the Kafka Summit San Francisco 2019 event.
Ensure that only authorized clients have appropriate access to system resources by using RBAC with Kafka Connect.
This paper will guide developers who want to build an integration or connector and outlines the criteria used for Confluent to verify the integration.
With its ViZix® item chain management platform, Mojix is helping major retailers store, analyze and act on inventory data collected from IoT sensor streams in real time.
Download this bridge-to-cloud deployment guide for designing, configuring and managing streaming applications in Confluent Cloud.
Learn why organizations are considering Apache Kafka to streamline cloud migrations.
Alight Solutions recently embarked on an initiative to align the company’s internal organization with its next-generation digital strategy.
Get the presentations from the Kafka Summit San Francisco 2018 event.
Hans Jespersen (VP WW Systems Engineering, Confluent) opened the afternoon presentations at Confluent’s streaming event in Paris with “Confluent Cloud: Agility for the Modern Data-Driven Enterprise.”
In this talk, Gwen Shapira breaks through the clutter and looks at how successful companies are adopting centralized streaming platforms, and at the use cases and methodologies we see practiced right now.
Use cases for streaming platforms vary widely, starting with improving the customer experience. We have synthesized some common themes of streaming maturity and identified five stages of adoption.
Get key research stats on why CIOs are turning to streaming data for a competitive advantage.
Download this Forrester study to understand the economic benefits of Confluent Platform. Learn how you can reduce DevOps costs by $2.4M and accelerate business enablement by $3.8M.
In this paper, we introduce the Dual Streaming Model. The model presents the result of an operator as a stream of successive updates, which induces a duality of results and streams.
In this talk, we’ll explain the architectural reasoning for Apache Kafka® and the benefits of real-time integration, and we’ll build a streaming data pipeline using nothing but our bare hands, Kafka Connect and KSQL.
The reference architecture provides a detailed architecture for deploying Confluent Platform on Kubernetes and uses the Helm Charts for Confluent Platform as a reference to illustrate configuration and deployment practices.
Join The New York Times' Director of Engineering Boerge Svingen to learn how the innovative news giant of America transformed the way it sources content—all through the power of a real-time streaming platform.
Joe Beda, CTO of Heptio and co-creator of Kubernetes, and Gwen Shapira, principal data architect at Confluent, will help you decide if running Kafka on Kubernetes is the right approach for your organization.
This survey of the Apache Kafka community shows how and why companies are adopting streaming platforms to build event-driven architectures.
Download the deployment guide to designing, configuring, and managing stream processing in Confluent Cloud with KSQL.
In this white paper, we offer recommendations and best practices for designing data architectures that will work well with Confluent Cloud.
Learn Kubernetes terms, concepts and considerations, as well as best practices for deploying Apache Kafka on Kubernetes.
Originally presented by Gwen Shapira at Gluecon 2018, this talk covers the similarities and differences between the communication layer provided by a service mesh and Apache Kafka, their implementations, and ways you can combine them.
Read this white paper to learn about the common use cases Confluent is seeing amongst its financial services customers.
Get an introduction to and demo of KSQL, Streaming SQL for Apache Kafka.
This video offers an introduction to Kafka stream processing, with a focus on KSQL.
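As a hedged illustration of the kind of statement KSQL executes, the sketch below submits a simple persistent query to a KSQL server's REST endpoint (default port 8088); the stream and column names are hypothetical and assume the source stream already exists.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class SubmitKsqlStatement {
    public static void main(String[] args) throws Exception {
        // Hypothetical persistent query: continuously project two columns into a new stream.
        String body = "{\"ksql\": \"CREATE STREAM pageviews_short AS "
                    + "SELECT userid, pageid FROM pageviews;\", "
                    + "\"streamsProperties\": {}}";

        HttpRequest request = HttpRequest.newBuilder()
            .uri(URI.create("http://localhost:8088/ksql"))
            .header("Content-Type", "application/vnd.ksql.v1+json")
            .POST(HttpRequest.BodyPublishers.ofString(body))
            .build();

        System.out.println(HttpClient.newHttpClient()
            .send(request, HttpResponse.BodyHandlers.ofString()).body());
    }
}
```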
Learn how service-based architectures and stream processing tools such as Apache Kafka can help you build business-critical systems.
Learn about the impact of Confluent and Apache Kafka® on Funding Circle’s lending marketplace, from Kafka Connect to exactly-once processing.
HomeAway, the world’s leading online marketplace for the vacation rental industry, uses Apache Kafka® and Confluent to match travelers with 2 million+ unique places to stay in 190 countries.
One of the largest banks in the world—with 16 million clients globally—RBC built a real-time, scalable and event-driven data architecture for their rapidly growing number of cloud, machine learning and AI initiatives.
In this white paper, you will learn how you can monitor your Apache Kafka deployments like a pro, the 7 common questions you'll need to answer, what requirements to look for in a monitoring solution and key advantages of the Confluent Control Center.
Learn about typical Apache Kafka use cases and how organisations can process large quantities of data in real time using the Kafka Streams API and KSQL.
This paper provides 10 principles for streaming services: a list of items to be mindful of when designing and building a microservices system.
Kafka has a set of new features supporting idempotence and transactional writes that enable building real-time applications with exactly-once semantics. This talk provides an overview of these features.
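To make those features concrete, here is a minimal sketch of a transactional Java producer using the standard Kafka transactions API; the topic, keys, and transactional.id are hypothetical.

```java
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

public class ExactlyOnceProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092"); // hypothetical broker
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("enable.idempotence", "true");        // de-duplicates retried writes
        props.put("transactional.id", "payments-tx-1"); // must be unique per producer instance

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            producer.initTransactions();
            producer.beginTransaction();
            try {
                producer.send(new ProducerRecord<>("payments", "acct-1", "debit:100"));
                producer.send(new ProducerRecord<>("payments", "acct-2", "credit:100"));
                producer.commitTransaction(); // both records become visible atomically
            } catch (Exception e) {
                producer.abortTransaction();  // read_committed consumers never see either record
                throw e;
            }
        }
    }
}
```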
In this talk, get a short introduction to common approaches and architectures (lambda, kappa) for stream processing, and learn how to use open-source stream processing tools (Flink, Kafka Streams, Hazelcast Jet).
In this talk, we’ll review the breadth of Apache Kafka as a streaming data platform, including its internal architecture and its approach to pub/sub messaging.
In this talk, we'll examine how stateful stream processing can be used to build event-driven services with a distributed log like Apache Kafka. In doing so, the data dichotomy is balanced with an architecture that exhibits demonstrably better scaling properties as complexity, team size, data volume, or velocity increases.
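As a small sketch of what stateful processing over a log looks like, the Kafka Streams program below counts events per key into a local, changelog-backed state store; the topic, store, and application names are hypothetical.

```java
import java.util.Properties;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.Consumed;
import org.apache.kafka.streams.kstream.Materialized;
import org.apache.kafka.streams.kstream.Produced;

public class StatefulCounts {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "order-counter");     // hypothetical
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // hypothetical

        StreamsBuilder builder = new StreamsBuilder();
        builder.stream("orders", Consumed.with(Serdes.String(), Serdes.String()))
               .groupByKey()
               .count(Materialized.as("orders-per-customer")) // local store, backed by a changelog topic
               .toStream()
               .to("order-counts", Produced.with(Serdes.String(), Serdes.Long()));

        new KafkaStreams(builder.build(), props).start();
    }
}
```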
Get the presentations from the Kafka Summit San Francisco 2017 event.
A practical guide to configuring multiple Apache Kafka clusters so that if a disaster scenario strikes, you have a plan for failover, failback, and ultimately successful recovery.
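One widely used failover technique, sketched below as an assumption-laden example rather than an excerpt from the guide, is to rewind a consumer on the secondary cluster to a timestamp shortly before the outage, since offsets rarely match across clusters while record timestamps are preserved by replication. The cluster address, topic, and group id are placeholders.

```java
import java.time.Duration;
import java.util.List;
import java.util.Map;
import java.util.Properties;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.clients.consumer.OffsetAndTimestamp;
import org.apache.kafka.common.TopicPartition;

public class FailoverRewind {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "dr-cluster.example.internal:9092"); // secondary cluster
        props.put("group.id", "orders-app");
        props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            TopicPartition tp = new TopicPartition("orders", 0);
            consumer.assign(List.of(tp));

            // Resume from five minutes before the failover, accepting some reprocessing.
            long resumeFrom = System.currentTimeMillis() - Duration.ofMinutes(5).toMillis();
            Map<TopicPartition, OffsetAndTimestamp> offsets =
                consumer.offsetsForTimes(Map.of(tp, resumeFrom));
            OffsetAndTimestamp target = offsets.get(tp);
            if (target != null) {
                consumer.seek(tp, target.offset()); // offsets differ per cluster; timestamps do not
            }
        }
    }
}
```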
What are microservices, and how do they work in the Apache Kafka ecosystem?
In this video, Tim Berglund explains how you can speed up development with the Confluent Command Line Interface (CLI), which allows you to quickly iterate while implementing your applications and enables you to interact with the Confluent ecosystem.
In this talk, I'll describe some of the design tradeoffs when building microservices, and how Apache Kafka's powerful abstractions can help.
In this recording from QCon New York 2017, Gwen Shapira discusses patterns of schema design, schema storage, and schema evolution that help development teams build better contracts through better collaboration, and deliver resilient applications faster.
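One concrete instance of the "better contracts" idea is backward-compatible schema evolution: adding a field with a default so new readers can still decode old data. The sketch below uses the Avro Java library with hypothetical record and field names.

```java
import org.apache.avro.Schema;

public class SchemaEvolutionDemo {
    public static void main(String[] args) {
        // v1 of a hypothetical Order record
        Schema v1 = new Schema.Parser().parse(
            "{\"type\":\"record\",\"name\":\"Order\",\"fields\":["
          + "{\"name\":\"id\",\"type\":\"string\"},"
          + "{\"name\":\"amount\",\"type\":\"double\"}]}");

        // v2 adds a field WITH a default, so v2 readers can still decode v1 data
        Schema v2 = new Schema.Parser().parse(
            "{\"type\":\"record\",\"name\":\"Order\",\"fields\":["
          + "{\"name\":\"id\",\"type\":\"string\"},"
          + "{\"name\":\"amount\",\"type\":\"double\"},"
          + "{\"name\":\"currency\",\"type\":\"string\",\"default\":\"USD\"}]}");

        System.out.println(v2.getField("currency").defaultVal()); // prints: USD
    }
}
```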
Confluent Cloud is the industry's only cloud-native, fully managed event streaming platform powered by Apache Kafka.
Learn how to take full advantage of Apache Kafka®, the distributed, publish-subscribe queue for handling real-time data feeds.
Businesses are discovering that they can create new business opportunities, as well as make their existing operations more efficient, using real-time data at scale. Learn how real-time data streams are revolutionizing your business.
Get the presentations from the Kafka Summit San Francisco 2016 event.
This whitepaper discusses how to optimize your Apache Kafka deployment for various service goals, including throughput, latency, durability, and availability. It is intended for Kafka administrators and developers planning to deploy Kafka in production.
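By way of illustration only (the concrete values below are assumptions, not the paper's recommendations), a producer tuned for throughput and one tuned for durability might start from configurations like these:

```java
import java.util.Properties;

public class ProducerTuningSketch {
    public static void main(String[] args) {
        // Tilted toward throughput: batch aggressively, compress, acknowledge early.
        Properties forThroughput = new Properties();
        forThroughput.put("acks", "1");       // leader-only acknowledgement
        forThroughput.put("linger.ms", "20"); // wait briefly to fill larger batches
        forThroughput.put("batch.size", String.valueOf(256 * 1024));
        forThroughput.put("compression.type", "lz4");

        // Tilted toward durability: wait for all in-sync replicas, never duplicate.
        Properties forDurability = new Properties();
        forDurability.put("acks", "all");
        forDurability.put("enable.idempotence", "true");

        System.out.println(forThroughput);
        System.out.println(forDurability);
    }
}
```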
This survey focuses on why and how companies are using Apache Kafka and streaming data and the impact it has on their business.
In this three-day hands-on course you will learn how to build an application that can publish data to, and subscribe to data from, an Apache Kafka cluster.
In this three-day hands-on course, you will learn how to build, manage, and monitor clusters using industry best-practices developed by the world’s foremost Apache Kafka experts.
In this talk, Matt Howlett will give a technical overview of Kafka, discuss some typical use cases (from surge pricing to fraud detection to web analytics) and show you how to use Kafka from within your C#/.NET applications.
In this presentation from the Apache Kafka Meetup at Strata San Jose (3/14/17), Jay Kreps introduces Kafka and explains why it has become the de facto standard for streaming data.
This white paper provides a brief overview of how microservices can be built in the Apache Kafka ecosystem.
Michael Noll provides an introduction to stream processing, use cases, and Apache Kafka.
Jay Kreps, CEO of Confluent and co-creator of Apache Kafka, shows how logs work in distributed systems, and provides practical applications of these concepts.
This white paper outlines the integration of Confluent Enterprise with the Microsoft Azure Cloud Platform.
Best practices for developing a connector using Kafka Connect APIs.
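A minimal, hypothetical skeleton of the moving parts, a SourceConnector that hands its configuration to one SourceTask, which emits records from poll(), might look like the sketch below; a real connector would read from an external system and track offsets carefully.

```java
import java.util.List;
import java.util.Map;
import org.apache.kafka.common.config.ConfigDef;
import org.apache.kafka.connect.connector.Task;
import org.apache.kafka.connect.data.Schema;
import org.apache.kafka.connect.source.SourceConnector;
import org.apache.kafka.connect.source.SourceRecord;
import org.apache.kafka.connect.source.SourceTask;

// Hypothetical connector that emits one static record per poll.
public class DemoSourceConnector extends SourceConnector {
    private Map<String, String> config;

    @Override public void start(Map<String, String> props) { this.config = props; }
    @Override public Class<? extends Task> taskClass() { return DemoSourceTask.class; }
    @Override public List<Map<String, String>> taskConfigs(int maxTasks) {
        return List.of(config); // a single task, for simplicity
    }
    @Override public void stop() {}
    @Override public ConfigDef config() { return new ConfigDef(); }
    @Override public String version() { return "0.1.0"; }

    public static class DemoSourceTask extends SourceTask {
        @Override public void start(Map<String, String> props) {}
        @Override public List<SourceRecord> poll() throws InterruptedException {
            Thread.sleep(1000); // throttle; a real task would block on its source
            return List.of(new SourceRecord(
                Map.of("source", "demo"), Map.of("position", 0L), // source partition / offset
                "demo-topic", Schema.STRING_SCHEMA, "hello"));
        }
        @Override public void stop() {}
        @Override public String version() { return "0.1.0"; }
    }
}
```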
In this paper, we explore some of the fundamental concepts of Apache Kafka, the foundation of Confluent Platform, and compare it to traditional message-oriented middleware.
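One of those fundamental differences is that Kafka consumers read from a durable, replayable log instead of destructively dequeuing messages. The hedged sketch below rewinds a consumer group to the beginning of a hypothetical topic, something classic message-oriented middleware cannot do.

```java
import java.time.Duration;
import java.util.List;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.KafkaConsumer;

public class ReplayFromBeginning {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092"); // hypothetical broker
        props.put("group.id", "audit-replay");            // hypothetical group
        props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(List.of("payments"));
            consumer.poll(Duration.ofSeconds(1));            // join group and receive assignment
            consumer.seekToBeginning(consumer.assignment()); // replay the entire retained log
            for (ConsumerRecord<String, String> record : consumer.poll(Duration.ofSeconds(1))) {
                System.out.printf("%s -> %s%n", record.key(), record.value());
            }
        }
    }
}
```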
Learn about the components of Confluent Enterprise, key considerations for production deployments, and guidelines for selecting hardware or deployment with different cloud providers.
In this book, O’Reilly author Martin Kleppmann shows you how stream processing can make your data processing systems more flexible and less complex.
Neha Narkhede explains how Apache Kafka was designed to support capturing and processing distributed data streams by building up the basic primitives needed for a stream processing system.