Rich Pre-built ecosystem

Enable fast, broad and reliable Apache Kafka® connectivity

Confluent Platform offers a rich pre-built ecosystem of over 100 Kafka connectors and a schema registry to rapidly and reliably build event streaming applications around Kafka.

Features

Pre-built Kafka Connectors

Confluent develops, and works with partners who develop, enterprise-ready connectors based on the Kafka Connect framework. Connectors are supported by either Confluent or our partners, and some are also available as fully managed connectors in Confluent Cloud.
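As a sketch of how a Kafka Connect connector is deployed, the fragment below shows a connector configuration in the JSON shape accepted by the Connect REST API (`POST /connectors`). The connector name, database URL, and column name are illustrative assumptions; the `connector.class` refers to Confluent's JDBC source connector.

```json
{
  "name": "orders-jdbc-source",
  "config": {
    "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
    "connection.url": "jdbc:postgresql://db.example.com:5432/orders",
    "mode": "incrementing",
    "incrementing.column.name": "id",
    "topic.prefix": "pg-"
  }
}
```

Submitting this document to a Connect worker's REST endpoint starts a connector that streams new rows from the source table into a Kafka topic, with no custom integration code.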

Confluent Hub

Confluent Hub is an online marketplace where you can browse, search, and filter connectors and other plugins to find the best fit for your Kafka data movement needs.
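Plugins listed on Confluent Hub can be installed with the `confluent-hub` command-line tool. The component coordinates below are illustrative; any listing on the Hub follows the same `owner/component:version` pattern.

```shell
# Install the JDBC connector plugin from Confluent Hub into a local
# Confluent Platform installation, then restart the Connect workers
# so the new plugin is picked up.
confluent-hub install confluentinc/kafka-connect-jdbc:latest
```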

Schema Registry

Schema Registry is a central repository with a RESTful interface where developers define standard schemas and register applications, enabling compatibility checks as schemas evolve. Schema Registry is available as a software component of Confluent Platform or as a managed component of Confluent Cloud.
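Registering a schema is a single call against Schema Registry's REST interface. The sketch below registers an Avro schema under a subject; the subject name, record shape, and `localhost:8081` address are assumptions for illustration.

```shell
# Register a new Avro schema version for the subject "orders-value".
# Assumes Schema Registry is listening on localhost:8081.
curl -X POST \
  -H "Content-Type: application/vnd.schemaregistry.v1+json" \
  --data '{"schema": "{\"type\":\"record\",\"name\":\"Order\",\"fields\":[{\"name\":\"id\",\"type\":\"long\"}]}"}' \
  http://localhost:8081/subjects/orders-value/versions
```

The response contains the globally unique schema ID that serializers embed in each message, so consumers can always look up the exact schema a record was written with.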

MQTT Proxy

MQTT Proxy delivers Kafka-native connectivity to IoT devices without the need for intermediate MQTT brokers, eliminating their additional cost and latency. It provides a direct, scalable path for IoT data into Kafka, with delivery guarantees, without adding further layers of complexity.
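A minimal configuration sketch for MQTT Proxy might look like the following. The listener port, broker address, and topic-routing regex are illustrative assumptions, not a definitive configuration.

```properties
# kafka-mqtt.properties (illustrative values)
# Port where IoT devices connect with the standard MQTT protocol
listeners=0.0.0.0:1883
# The Kafka cluster the proxy writes into
bootstrap.servers=PLAINTEXT://kafka-1:9092
# Route MQTT topics matching a regex to a Kafka topic
topic.regex.list=temperature:.*temperature
```

Devices keep speaking plain MQTT; the proxy translates publishes directly into Kafka produce requests, so no separate MQTT broker cluster needs to be operated in between.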

Enable application development compatibility

Develop using standard schemas

Store and share a versioned history of all standard schemas, and validate data compatibility at the client level. Schema Registry supports Avro, JSON and Protobuf serialization formats.
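As an example of a compatible schema evolution, adding a field with a default value is a backward-compatible change in Avro: consumers on the new version can still read records written with the old one. The sketch below shows version 2 of a hypothetical `Order` schema whose first version contained only the `id` field.

```json
{
  "type": "record",
  "name": "Order",
  "fields": [
    {"name": "id", "type": "long"},
    {"name": "status", "type": "string", "default": "NEW"}
  ]
}
```

Because `status` carries a default, Schema Registry's compatibility check (BACKWARD by default) accepts this as a new version of the subject, while a change such as removing `id` would be rejected.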

Reduce operational complexity

Schema Registry reduces operational complexity in the application development cycle, because it eliminates the need for complex coordination among developers. Need to add a new column to a downstream database? You don’t need an involved change process and at least 5 meetings to coordinate 20 teams.
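Instead of coordinating the change in meetings, a team can ask Schema Registry programmatically whether a proposed schema is compatible before deploying it. The subject name, record shape, and address below are illustrative assumptions.

```shell
# Test whether a proposed schema is compatible with the latest
# registered version of the subject (assumes localhost:8081).
curl -X POST \
  -H "Content-Type: application/vnd.schemaregistry.v1+json" \
  --data '{"schema": "{\"type\":\"record\",\"name\":\"Order\",\"fields\":[{\"name\":\"id\",\"type\":\"long\"},{\"name\":\"status\",\"type\":\"string\",\"default\":\"NEW\"}]}"}' \
  http://localhost:8081/compatibility/subjects/orders-value/versions/latest
```

The response is a simple `{"is_compatible": true}` or `false`, which makes the check easy to wire into a CI pipeline.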

Deploy confidently in production

Scale Kafka schemas reliably

Schema Validation delivers a programmatic way of validating and enforcing Schema Registry schemas directly on the Kafka broker and with topic-level granularity. It provides greater control over data quality, which increases the reliability of the entire Kafka ecosystem.
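Because Schema Validation is a topic-level setting, it can be switched on per topic at creation time. The sketch below assumes Confluent Server and illustrative broker and topic names.

```shell
# Enable Schema Validation for message values on a single topic
# (requires Confluent Server with Schema Registry configured).
kafka-topics --bootstrap-server kafka-1:9092 --create \
  --topic orders \
  --config confluent.value.schema.validation=true
```

With this setting, the broker rejects produce requests whose messages do not reference a valid registered schema, rather than relying on every client to serialize correctly.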

Manage the Kafka ecosystem centrally

Simplify management for production environments using Control Center as the GUI.

  • Manage multiple connectors: add, edit, and delete connectors across multiple Connect clusters.
  • Manage every schema: create, edit, and view topic schemas, compare versions, and enable Schema Validation when creating new topics.

Resources