Stream Processing

Our partner provides the industry’s only enterprise-ready Event Streaming Platform, driving a new paradigm for application and data infrastructure. It unifies real-time and historical events in a single platform, enabling you to build an entirely new category of event-driven applications and gain a universal event pipeline.

The platform makes it easy to build real-time data pipelines and streaming applications by integrating data from multiple sources and locations into a single, central Event Streaming Platform for your company. Confluent Platform lets you focus on deriving business value from your data rather than worrying about underlying mechanics such as how data is transported or transformed between systems. Specifically, it simplifies connecting data sources to Kafka, building applications with Kafka, and securing, monitoring, and managing your Kafka infrastructure.

Kafka Java Client APIs

  • Producer API is a Java Client that allows an application to publish a stream of records to one or more Kafka topics (a minimal producer and consumer sketch follows this list).
  • Consumer API is a Java Client that allows an application to subscribe to one or more topics and process the stream of records produced to them.
  • Streams API allows an application to act as a stream processor, consuming an input stream from one or more topics and producing an output stream to one or more output topics, effectively transforming input streams into output streams. It has a very low barrier to entry, easy operationalization, and a high-level DSL for writing stream processing applications, making it the most convenient yet scalable option for processing and analyzing data backed by Kafka (a short Streams sketch also follows this list).
  • Connect API is a component that you can use to stream data between Kafka and other data systems in a scalable and reliable way. It makes it simple to configure connectors to move data into and out of Kafka. Kafka Connect can ingest entire databases or collect metrics from all your application servers into Kafka topics, making the data available for stream processing. Connectors can also deliver data from Kafka topics into secondary indexes like Elasticsearch or into batch systems such as Hadoop for offline analysis.
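As a rough illustration of the Producer and Consumer APIs, the sketch below publishes a few string records to a topic and then reads them back. The broker address (localhost:9092), topic name (example-events), and consumer group id (example-group) are placeholders chosen for this example, not values taken from the platform documentation.

import java.time.Duration;
import java.util.List;
import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.apache.kafka.common.serialization.StringSerializer;

public class ProducerConsumerSketch {

    public static void main(String[] args) {
        // Placeholder broker address and topic name -- adjust for your cluster.
        String bootstrap = "localhost:9092";
        String topic = "example-events";

        // Producer API: publish a small stream of records to the topic.
        Properties producerProps = new Properties();
        producerProps.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrap);
        producerProps.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        producerProps.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(producerProps)) {
            for (int i = 0; i < 3; i++) {
                producer.send(new ProducerRecord<>(topic, "key-" + i, "value-" + i));
            }
        } // close() flushes any pending records

        // Consumer API: subscribe to the topic and process the records produced to it.
        Properties consumerProps = new Properties();
        consumerProps.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrap);
        consumerProps.put(ConsumerConfig.GROUP_ID_CONFIG, "example-group");
        consumerProps.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");
        consumerProps.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        consumerProps.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(consumerProps)) {
            consumer.subscribe(List.of(topic));
            ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(5));
            for (ConsumerRecord<String, String> record : records) {
                System.out.printf("offset=%d key=%s value=%s%n",
                        record.offset(), record.key(), record.value());
            }
        }
    }
}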
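The Streams API's high-level DSL can express a simple transformation in a few lines. The sketch below, using the same placeholder broker address and topic names plus an assumed output topic (example-events-uppercased), consumes an input stream, upper-cases each value, and produces the result to the output topic.

import java.util.Properties;

import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;

public class StreamsSketch {

    public static void main(String[] args) {
        // Placeholder application id, broker address, and topic names.
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "example-streams-app");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        // High-level DSL: consume an input stream, transform it, and produce an output stream.
        StreamsBuilder builder = new StreamsBuilder();
        KStream<String, String> input = builder.stream("example-events");
        input.mapValues(value -> value.toUpperCase())
             .to("example-events-uppercased");

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();

        // Shut the topology down cleanly on JVM exit.
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }
}

The Connect API, by contrast, is typically driven by connector configuration rather than application code, so no Java sketch is shown for it here.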