
Monitoring Apache Kafka with Confluent Control Center Video Tutorials

Mission-critical applications built on Kafka need a robust, Kafka-specific monitoring solution that is performant, scalable, durable, highly available, and secure. Confluent Control Center helps monitor your Kafka deployments and provides assurance that your services are behaving properly and meeting SLAs. Control Center was designed for Kafka, by the creators of Kafka, from day zero. It surfaces the most important information to act on, so you can answer key business-level questions:

  • Are my brokers up?
  • Are applications receiving all data?
  • Are they up to date with the latest events?
  • Why are the applications running slowly?
  • Do we need to scale out?

You may think Kafka is running fine, but how do you prove it?
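Several of the questions above, such as "are applications up to date with the latest events?", come down to consumer lag: how far a consumer group's committed offset trails the partition's log end offset. As a minimal sketch of the arithmetic Control Center's stream monitoring builds on (the offset values here are hypothetical):

```python
# Consumer lag per partition: how far the committed offset
# trails the log end offset. Zero lag means the consumer is
# fully caught up; growing lag suggests a slow consumer.
def partition_lag(end_offsets, committed_offsets):
    """Return {partition: lag} for every partition in end_offsets.

    A partition with no committed offset is treated as lagging
    by its full log length.
    """
    return {p: end_offsets[p] - committed_offsets.get(p, 0)
            for p in end_offsets}

# Hypothetical snapshot of a two-partition topic.
end = {0: 1500, 1: 980}        # latest offsets per partition
committed = {0: 1400, 1: 980}  # the consumer group's committed offsets
print(partition_lag(end, committed))  # {0: 100, 1: 0}
```

In practice these offsets come from the broker (for example via the `kafka-consumer-groups` CLI or the consumer API) rather than hard-coded maps; the point is only what "lag" measures.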

Control Center can monitor your Kafka cluster and applications with important monitoring features like System health, End-to-end stream monitoring, and Alerting.

Watch the introductory Confluent Control Center video Monitoring Kafka like a Pro (3:30).



If you are in the early phase of your exploration of Confluent Control Center, you can learn more about what it can do by watching the Confluent Control Center Overview video series.

If you are ready to get hands on, check out our Confluent Platform Demo GitHub repo. Following the realistic scenario in this repo, which takes just a few seconds to spin up, you will use Control Center to monitor a Kafka cluster and then walk through a playbook of various operational events. The use case is as follows:

A streaming ETL pipeline built around live edits to real Wikipedia pages. The Wikimedia Foundation has IRC channels that publish edits happening to real wiki pages in real time. Using Kafka Connect, a Kafka source connector kafka-connect-irc streams raw messages from these IRC channels, and a custom Kafka Connect transform kafka-connect-transform-wikiedit transforms these messages before they are written to Kafka. This demo uses KSQL for data enrichment, or you can optionally develop and run your own Kafka Streams application. Then a Kafka sink connector kafka-connect-elasticsearch streams the data out of Kafka, and the data is materialized into Elasticsearch for analysis by Kibana.
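The KSQL enrichment step in a pipeline like this might look like the following sketch. The topic name, stream names, and the `isbot` column are illustrative assumptions, not the demo's exact definitions:

```sql
-- Register the raw edits topic as a KSQL stream
-- (topic and column names are hypothetical; the value schema
-- is read from the Schema Registry because the format is Avro).
CREATE STREAM wikipedia_edits
  WITH (KAFKA_TOPIC='wikipedia.parsed', VALUE_FORMAT='AVRO');

-- Enrich: continuously filter out bot edits into a new stream,
-- which is itself backed by a new Kafka topic.
CREATE STREAM wikipedia_no_bots AS
  SELECT *
  FROM wikipedia_edits
  WHERE isbot <> true;
```

Derived streams like `wikipedia_no_bots` are what a downstream sink connector (here, Elasticsearch) would consume.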

The cp-demo repo comes with a playbook for operational events and corresponding video tutorials of useful scenarios to run through with Control Center:

  • Installing and running the demo (Demo 1: Install + Run | Monitoring Kafka in Confluent Control Center)
  • Tour of Confluent Control Center (Demo 2: Tour | Monitoring Kafka in Confluent Control Center)
  • KSQL (Demo 3: KSQL | Monitoring Kafka in Confluent Control Center)
  • Consumer rebalances (Demo 4: Consumer Rebalances | Monitoring Kafka in Confluent Control Center)
  • Slow consumers (Demo 5: Slow Consumers | Monitoring Kafka in Confluent Control Center)
  • Over consumption (Demo 6: Over Consumption | Monitoring Kafka in Confluent Control Center)
  • Under consumption (Demo 7: Under Consumption | Monitoring Kafka in Confluent Control Center)
  • Failed broker (Demo 8: Failed Broker | Monitoring Kafka in Confluent Control Center)
  • Alerting (Demo 9: Alerting | Monitoring Kafka in Confluent Control Center)

You can also download the Confluent Platform and bring up your own cluster and Confluent Control Center locally with the quickstart.
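A local setup along the lines of the quickstart can also be sketched with Docker Compose. This is a minimal single-broker sketch for illustration only; the image tags, ports, and settings here are assumptions, and the official quickstart is authoritative:

```yaml
# Minimal local Kafka + Control Center sketch (hypothetical versions/settings).
version: "2"
services:
  zookeeper:
    image: confluentinc/cp-zookeeper:5.5.0
    environment:
      ZOOKEEPER_CLIENT_PORT: 2181

  kafka:
    image: confluentinc/cp-kafka:5.5.0
    depends_on: [zookeeper]
    environment:
      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
      KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://kafka:9092
      # Single broker, so internal topics cannot be replicated.
      KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: 1

  control-center:
    image: confluentinc/cp-enterprise-control-center:5.5.0
    depends_on: [kafka]
    ports: ["9021:9021"]   # Control Center web UI
    environment:
      CONTROL_CENTER_BOOTSTRAP_SERVERS: kafka:9092
      CONTROL_CENTER_REPLICATION_FACTOR: 1
```

With a stack like this running, the Control Center UI is served on port 9021.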
