Event-Driven Architecture Using Spring Cloud and the Kafka Broker

Anas BENSTITOU
4 min read · Mar 30, 2022


Hello everyone! In this tutorial we will implement an event-driven architecture using Spring Boot, Spring Cloud, and the Kafka broker. Enjoy!

The goal of this tutorial is to implement an event-driven architecture with one producer and one consumer communicating over a single topic.

The final Architecture

Prerequisites

  • Java 8 or greater (I’m using Java 17 for this tutorial)
  • Maven 2 or greater
  • Docker (for docker compose)

Spring Cloud Stream

Spring Cloud Stream is the Spring ecosystem’s solution for building applications connected to shared messaging systems such as Kafka, RabbitMQ, Azure Event Hubs, and more.

Spring Cloud Stream offers a binding abstraction over the messaging system and works the same way whichever binder you use. So you can build an application on Kafka as the messaging system and, without any code changes, switch to another messaging system like RabbitMQ, or even move to a cloud messaging service.

To use Spring Cloud Stream with Kafka, add the dependency to your Maven project:

<dependency>
    <groupId>org.springframework.cloud</groupId>
    <artifactId>spring-cloud-stream-binder-kafka</artifactId>
</dependency>

Or:

<dependency>
    <groupId>org.springframework.cloud</groupId>
    <artifactId>spring-cloud-starter-stream-kafka</artifactId>
</dependency>

I — Create the Producer

First, we need to create a new project to produce messages to the topic.

Then, we will add a new controller with a simple POST API to trigger the produce process:
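As a sketch, assuming a service component named MessageService and an endpoint at /produce (both names are illustrative, not taken from the original code), the controller could look like this:

```java
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestBody;
import org.springframework.web.bind.annotation.RestController;

// Stand-in for the service component the controller delegates to; name is illustrative.
interface MessageService {
    void send(String payload);
}

// Hypothetical controller: a single POST endpoint that triggers the produce process.
@RestController
class MessageController {

    private final MessageService messageService;

    MessageController(MessageService messageService) {
        this.messageService = messageService;
    }

    @PostMapping("/produce")
    public void produce(@RequestBody String payload) {
        messageService.send(payload);
    }
}
```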

The controller uses a service to create a new Message object and calls the integration service to send the message to Kafka:
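A minimal sketch of such a service, assuming a Message model with an id, a payload, and a timestamp, plus an integration-layer interface it delegates to (all names and fields here are illustrative):

```java
import java.time.Instant;
import java.util.UUID;

// Hypothetical message model; the real field set may differ.
record Message(String id, String payload, Instant createdAt) {}

// The integration layer the service hands messages to; in the real app this is
// the Spring-managed integration service that talks to Kafka.
interface MessagePublisher {
    void publish(Message message);
}

// Creates a new Message from the raw payload and forwards it for publishing.
// In the Spring app this would be a @Service with constructor injection.
class MessageService {

    private final MessagePublisher publisher;

    MessageService(MessagePublisher publisher) {
        this.publisher = publisher;
    }

    Message send(String payload) {
        Message message = new Message(UUID.randomUUID().toString(), payload, Instant.now());
        publisher.publish(message);
        return message;
    }
}
```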

Then, the message integration service creates a new Event object and uses the StreamBridge to send the message to the topic:
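StreamBridge.send(bindingName, payload) is the Spring Cloud Stream API for sending to an output binding imperatively. A sketch of the integration service, with an assumed Event envelope and binding name (both are illustrative):

```java
import org.springframework.cloud.stream.function.StreamBridge;
import org.springframework.stereotype.Service;

// Hypothetical event envelope; the payload is the Message created by the service.
record Event(String type, Object payload) {}

// Wraps the message in an Event and sends it through the output binding.
// "produceMessage-out-0" must match the binding declared in the configuration.
@Service
class MessageIntegrationService {

    private final StreamBridge streamBridge;

    MessageIntegrationService(StreamBridge streamBridge) {
        this.streamBridge = streamBridge;
    }

    public void publish(Object message) {
        Event event = new Event("MESSAGE_CREATED", message);
        streamBridge.send("produceMessage-out-0", event);
    }
}
```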

And finally, we need to configure the producer to connect to the Kafka broker:
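A sketch of the producer’s application.yml; the binding name, topic name, and broker address are assumptions to adapt to your setup:

```yaml
spring:
  application:
    name: producer
  cloud:
    stream:
      bindings:
        produceMessage-out-0:
          destination: messages   # the Kafka topic
      kafka:
        binder:
          brokers: localhost:9092
```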

Note that I’m using two profiles, docker and kafka:

  • docker: to run the application as a Docker container
  • kafka: to add configuration specific to the Kafka broker; you can add more profiles if you want to use other messaging systems.
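For instance, the kafka profile could carry the broker address in its own application-kafka.yml (a sketch; the service name kafka refers to the broker service in Docker Compose):

```yaml
# application-kafka.yml: hypothetical profile-specific overrides
spring:
  cloud:
    stream:
      kafka:
        binder:
          brokers: kafka:9092   # "kafka" is the broker's service name in Docker Compose
```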

The final step is to Dockerize the application. Add a Dockerfile to your project with the following content:
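A minimal sketch of such a Dockerfile, assuming a Maven build that produces a single jar under target/ (the base image and activated profiles are assumptions):

```dockerfile
FROM eclipse-temurin:17-jre
COPY target/*.jar /app.jar
ENTRYPOINT ["java", "-jar", "/app.jar", "--spring.profiles.active=docker,kafka"]
```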

II — Create the Consumer

Create a new project to consume the messages from the Kafka topic.

And add the following code to your application:
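With Spring Cloud Stream’s functional model, a Consumer&lt;T&gt; bean is bound to an input binding named after the bean (here consumeMessage-in-0). A sketch, with the same assumed Event shape as on the producer side:

```java
import java.util.function.Consumer;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

// Hypothetical event shape mirroring the producer's envelope; when deserialized
// from JSON, the payload arrives as a generic structure.
record Event(String type, Object payload) {}

// Spring Cloud Stream binds this bean to the "consumeMessage-in-0" binding.
@Configuration
class MessageConsumerConfig {

    @Bean
    public Consumer<Event> consumeMessage() {
        return event -> System.out.println("Received event: " + event);
    }
}
```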

The configuration for the consumer app is pretty similar to the producer’s:
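A sketch of the consumer’s application.yml; again, the binding, topic, and group names are assumptions:

```yaml
spring:
  application:
    name: consumer
  cloud:
    function:
      definition: consumeMessage
    stream:
      bindings:
        consumeMessage-in-0:
          destination: messages     # same topic as the producer
          group: consumer-group     # consumer group for offset tracking
      kafka:
        binder:
          brokers: localhost:9092
```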

You can use the same Dockerfile used for the producer to Dockerize the consumer app.

III — Run them all

To run the application we need a Kafka broker. We can use a Docker image to run one locally, and Docker Compose to run all the Docker images that we need.

Create a new docker-compose.yml file with the following content:
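A sketch of such a docker-compose.yml, using the Confluent Kafka/Zookeeper images and the Provectus kafka-ui image (image tags, build paths, and port mappings are assumptions to adapt to your project layout):

```yaml
version: "3.8"
services:
  producer:
    build: ./producer          # folder containing the producer's Dockerfile
    ports:
      - "8080:8080"
    depends_on:
      - kafka
  consumer:
    build: ./consumer          # folder containing the consumer's Dockerfile
    depends_on:
      - kafka
  zookeeper:
    image: confluentinc/cp-zookeeper:7.0.1
    environment:
      ZOOKEEPER_CLIENT_PORT: 2181
  kafka:
    image: confluentinc/cp-kafka:7.0.1
    depends_on:
      - zookeeper
    environment:
      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
      KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://kafka:9092
      KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: 1
  kafka-ui:
    image: provectuslabs/kafka-ui:latest
    ports:
      - "8088:8080"
    environment:
      KAFKA_CLUSTERS_0_BOOTSTRAP_SERVERS: kafka:9092
      SERVER_SERVLET_CONTEXT_PATH: /ui   # serves the UI under /ui
```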

In this file we have five services:

  • producer: built from the Dockerfile of the producer application
  • consumer: built from the Dockerfile of the consumer application
  • zookeeper: a centralized service for maintaining configuration information, naming, providing distributed synchronization, and providing group services (https://zookeeper.apache.org/)
  • kafka: the message broker that we use for this tutorial
  • kafka-ui: a service to visualise and connect to our Kafka broker

Run the following command to start all the services:

$ docker-compose -p demo -f docker-compose.yml up -d

You can check the status of all the services using the Docker CLI or Docker Desktop:

All the services running on Docker Desktop

IV — Testing the application

Finally, you can test the application using curl or Postman: POST a new message to the URL http://127.0.0.1:8080/produce
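With curl, and assuming the endpoint accepts a JSON body (the exact payload shape depends on your controller), the request could look like:

```shell
curl -X POST http://127.0.0.1:8080/produce \
  -H "Content-Type: application/json" \
  -d '{"text": "Hello Kafka"}'
```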

Test using Postman

You can also check the logs of the producer and the consumer.

You should have a log similar to the following for the producer:

And a log similar to the following for the consumer:

You can access the Kafka UI from the URL http://localhost:8088/ui

And check the messages topic:

kafka-ui dashboard

V — Stop them all

Run the following command to stop all the services:

$ docker-compose -p demo -f docker-compose.yml down
