Building messaging solutions with Apache Kafka or IBM Event Streams for IBM Cloud

This multi-part blog series walks you through some of the key architectural considerations and steps for building messaging solutions with Apache Kafka or IBM Event Streams for IBM Cloud. The series will be helpful for developers, architects, and technology consultants who have a general understanding of Apache Kafka and are now looking to go deeper into evaluating and building messaging solutions.

Background

Once you start evaluating or working with Apache Kafka, you soon realize that behind its simplicity are numerous configuration “knobs and levers” for controlling how messages are handled. These options give Kafka the flexibility to fit many use cases, but poorly understood or poorly implemented settings can increase the risk of data loss and data inconsistency. It is important that the handling of the application's various functional aspects is well understood and tested, and it is equally important that the related operational and administrative interdependencies are proactively incorporated into the architecture. In short, the sooner you understand the behavior, consequences, and interplay of these configurations, the sooner you can mitigate risks, reduce operational overhead, and build a resilient, fault-tolerant solution for production.
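To make this concrete, here is a minimal sketch of a few producer settings whose interplay matters for durability and ordering. The broker address is a placeholder, and the values shown illustrate a durability-focused producer under those assumptions, not a recommendation for every workload:

```java
import java.util.Properties;
import org.apache.kafka.clients.producer.ProducerConfig;

public class DurableProducerSettings {
    // A few of the producer "knobs" whose interplay governs durability,
    // ordering, and duplicate behavior on the write path.
    static Properties build() {
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "broker1:9092"); // placeholder address
        props.put(ProducerConfig.ACKS_CONFIG, "all");                // wait for all in-sync replicas to acknowledge
        props.put(ProducerConfig.RETRIES_CONFIG, Integer.MAX_VALUE); // retry transient send failures instead of dropping records
        props.put(ProducerConfig.ENABLE_IDEMPOTENCE_CONFIG, true);   // ensure retries do not introduce duplicates
        props.put(ProducerConfig.MAX_IN_FLIGHT_REQUESTS_PER_CONNECTION, 5); // with idempotence, preserves per-partition ordering
        return props;
    }
}
```

Even this handful of settings shows the kind of trade-offs involved: `acks=all` strengthens durability at the cost of latency, and aggressive retries are only safe for ordering when combined with idempotence.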

Goals for this blog series

With the above in mind, the goals for this blog series are as follows:

  1. Provide the fundamental building blocks: Architectural considerations and best practices for a production-grade, robust solution with Kafka.
  2. Provide an approach: Iterative steps on how to start building a solution—gathering information, evaluating and designing, and tips for proactively managing and mitigating risks.


Learning and target audience

Like any technology, the concepts related to Kafka are wide and deep. There is plenty of good material on the internet that provides an overview of Kafka’s internal architecture and concepts. If you are new to Kafka, the best way to get started and familiarize yourself with the core concepts is through the official Apache Kafka documentation.

Once you have a general understanding of Kafka and are looking to go deeper into evaluating and building solutions, this blog can serve as a good starting reference template. Using examples and an iterative approach, it walks through the requirements-gathering and design process. This can help developers, architects, and consultants get a structured start on designing a solution for their use case. My comments may not be exhaustive or applicable to all use cases and situations, but I have attempted to mention related options and scenarios where possible. Any feedback is welcome.

A (very) brief introduction to Apache Kafka and Event Streams for IBM Cloud

At a high level, the main components in any Kafka solution are Producers and Consumers that interact with Kafka. Producers are part of your own client code that leverages the Kafka libraries; they create messages and write them to Kafka Topics.

Message records are written to a Topic Partition in the order they are received, and each is automatically assigned a position number, or offset. Consumers are also part of your client code, leveraging the Kafka libraries to read and process messages from the Topic.
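The sketch below illustrates this round trip using the standard Java clients. The broker address, topic name, and consumer group are hypothetical placeholders:

```java
import java.time.Duration;
import java.util.Collections;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.apache.kafka.common.serialization.StringSerializer;

public class QuickstartExample {
    public static void main(String[] args) {
        // Producer: create a message and write it to the "orders" topic (hypothetical name).
        Properties producerProps = new Properties();
        producerProps.put("bootstrap.servers", "broker1:9092"); // placeholder broker address
        producerProps.put("key.serializer", StringSerializer.class.getName());
        producerProps.put("value.serializer", StringSerializer.class.getName());
        try (KafkaProducer<String, String> producer = new KafkaProducer<>(producerProps)) {
            producer.send(new ProducerRecord<>("orders", "order-123", "{\"amount\": 42}"));
        } // close() flushes any buffered records

        // Consumer: read from the same topic; each record carries the offset
        // that was assigned to it within its partition on write.
        Properties consumerProps = new Properties();
        consumerProps.put("bootstrap.servers", "broker1:9092");
        consumerProps.put("group.id", "orders-readers");       // consumers in a group share the topic's partitions
        consumerProps.put("auto.offset.reset", "earliest");    // a new group starts from the beginning of the log
        consumerProps.put("key.deserializer", StringDeserializer.class.getName());
        consumerProps.put("value.deserializer", StringDeserializer.class.getName());
        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(consumerProps)) {
            consumer.subscribe(Collections.singletonList("orders"));
            ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(5));
            for (ConsumerRecord<String, String> record : records) {
                System.out.printf("partition=%d offset=%d value=%s%n",
                        record.partition(), record.offset(), record.value());
            }
        }
    }
}
```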

While Apache Kafka is a great platform, it is also a distributed platform. That means you need to routinely manage the infrastructure of all the distributed components—servers, replicas, and their configurations—in addition to the application logic in your Kafka Producer/Consumer code components.

Fortunately, Event Streams—available on IBM Cloud—takes many of these complexities away from the end user by providing Apache Kafka as a Service. With less to worry about in managing an Apache Kafka cluster and its operations, Event Streams for IBM Cloud lets developers and architects focus directly on getting value from Kafka and on designing resilient, fault-tolerant solutions on IBM Cloud. It’s quick and easy to set up your own Kafka service and get some hands-on experience. Check out the getting started page for Event Streams for IBM Cloud.
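As an illustration of how little cluster plumbing remains on the client side, the sketch below shows roughly what connecting a standard Kafka client to an Event Streams instance can look like. The endpoint and API key are placeholders; consult the Event Streams getting started documentation for the exact connection details for your instance:

```java
import java.util.Properties;

public class EventStreamsConnection {
    // Hedged sketch: the endpoint and API key below are placeholders that
    // would come from the service credentials of your own Event Streams instance.
    static Properties clientProperties() {
        Properties props = new Properties();
        props.put("bootstrap.servers", "broker-0-example.eventstreams.cloud.ibm.com:9093"); // placeholder endpoint
        props.put("security.protocol", "SASL_SSL"); // Event Streams endpoints are TLS-protected
        props.put("sasl.mechanism", "PLAIN");
        props.put("sasl.jaas.config",
                "org.apache.kafka.common.security.plain.PlainLoginModule required "
                + "username=\"token\" password=\"<API_KEY>\";"); // replace <API_KEY> with your service API key
        return props;
    }
}
```

Everything else—brokers, replicas, upgrades, and their configuration—is handled by the service.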

Let’s get to the main goals of this blog post.

The fundamental building blocks
