Deploying a Spring Boot app as a Kafka Producer on Kubernetes

In this tutorial we are going to use a Spring Boot app to send messages to a Kafka server created on an Aiven account, and deploy the app to a Kubernetes cluster.

Requirements

  • An account on the Aiven Console
  • Kubernetes cluster

Create a Kafka server on Aiven

Create the service

On the Aiven Console, click the "+ Create a new service" button. Select "Kafka 2.2" as the service and enter "kafka-poc" as the service name.

Download the CA Certificate

Once the service is created, go to the Overview tab and download the CA Certificate. The Service URI shown there will also be used by our application as the endpoint for sending messages.

Download Client Key and Cert

Now, go to the "Users" tab, enter a username and click "Add service user". Once the user is created, download the access key and access cert to our computer.

Create Keystore and Truststore

At this point we have three files: service.key, service.cert and ca.pem. Using those files we will generate two more files: client.keystore.p12 and client.truststore.jks.

Run the following commands:

openssl pkcs12 -export -inkey service.key -in service.cert -out client.keystore.p12 -name service_key
keytool -import -file ca.pem -alias CA -keystore client.truststore.jks

Both commands will ask for a password; be sure to save the passwords, as they will be required by our application.

Create Topic

Finally, go to the "Topics" tab and create the "poc-products" topic.

Create the Kafka producer app

Add Spring Kafka

In order to send messages to the Kafka server, we need the Spring Kafka library in our project. Add the spring-kafka dependency to the pom.xml (the version is managed by the Spring Boot parent, so it can be omitted):

<dependency>  
 <groupId>org.springframework.kafka</groupId>  
 <artifactId>spring-kafka</artifactId>  
</dependency>

We also have to add the connection properties for the Kafka server to the application.yaml:

spring.kafka:
  bootstrapServers: ${KAFKA_SERVER}
  producer:
    properties:
      security.protocol: SSL
      ssl:
        endpoint.identification.algorithm: ""
        key.password: ${KEY_PASSWORD}
        truststore:
          location: ${TRUSTSTORE_PATH}
          password: ${TRUSTSTORE_PASSWORD}
        keystore:
          type: PKCS12
          location: ${KEYSTORE_PATH}
          password: ${KEYSTORE_PASSWORD}

Note:
The KAFKA_SERVER variable is the Service URI that we got when creating our Kafka service on the Aiven Console. The path and password variables refer to the keystore and truststore files generated earlier and the passwords chosen for them.
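For reference, Spring maps the nested keys under producer.properties onto the flat property names that the underlying Kafka client actually consumes. A minimal sketch of the equivalent flat properties (the host name and passwords below are placeholders, not real values):

```java
import java.util.Properties;

public class ProducerProps {

    // Builds the flat Kafka client properties that the Spring YAML above resolves to.
    static Properties aivenSslProps(String bootstrap, String truststorePath, String keystorePath,
                                    String truststorePass, String keystorePass, String keyPass) {
        Properties p = new Properties();
        p.put("bootstrap.servers", bootstrap);
        p.put("security.protocol", "SSL");
        p.put("ssl.endpoint.identification.algorithm", "");
        p.put("ssl.key.password", keyPass);
        p.put("ssl.truststore.location", truststorePath);
        p.put("ssl.truststore.password", truststorePass);
        p.put("ssl.keystore.type", "PKCS12");
        p.put("ssl.keystore.location", keystorePath);
        p.put("ssl.keystore.password", keystorePass);
        return p;
    }

    public static void main(String[] args) {
        // Placeholder values; in the app they come from the environment variables above.
        Properties p = aivenSslProps("kafka-poc-example.aivencloud.com:12345",
                "/certs/client.truststore.jks", "/certs/client.keystore.p12",
                "ts-pass", "ks-pass", "key-pass");
        System.out.println(p.getProperty("security.protocol"));
    }
}
```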

And now we can use the KafkaTemplate to send messages to the topic:

@RestController
public class ProductController {

  private final ProductRepository repository;
  private final KafkaTemplate<String, String> template;

  // Spring injects the repository and the Kafka template via the constructor
  public ProductController(ProductRepository repository,
                           KafkaTemplate<String, String> template) {
    this.repository = repository;
    this.template = template;
  }

  @PostMapping("/products")
  public ProductModel postProduct(@RequestBody ProductModel product) {
    product = repository.save(product);

    // Publish the product name to the "poc-products" topic
    template.send("poc-products", product.getName());
    return product;
  }
}

Configure the Kubernetes cluster

Creating a Secret on Kubernetes

We need to add our truststore and keystore to the Kubernetes cluster as a Secret:

kubectl create secret generic kafka-certs --from-file=./client.truststore.jks --from-file=./client.keystore.p12

Deploy the application

This time, the deployment configuration uses the Secret from the previous step to mount the files at the "/certs" path inside the container:

kind: Deployment
apiVersion: apps/v1
metadata:
  name: kafka-test
  namespace: default
spec:
  replicas: 1
  selector:
    matchLabels:
      app: kafka-test
  template:
    metadata:
      labels:
        app: kafka-test
    spec:
      containers:
      - name: kafka-test
        image: fbereche/kafka-test:1.0.2
        ports:
        - containerPort: 8080
        volumeMounts:
        - name: certs
          mountPath: "/certs"
      volumes:
      - name: certs
        secret:
          secretName: kafka-certs
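The container also needs the environment variables that application.yaml references. One way to supply them is an env section on the container; in the sketch below, the Service URI is a placeholder and the "kafka-passwords" Secret is a hypothetical Secret you would create separately to hold the store passwords:

```yaml
        env:
        - name: KAFKA_SERVER
          value: "kafka-poc-example.aivencloud.com:12345"  # Service URI from the Aiven Console
        - name: TRUSTSTORE_PATH
          value: "/certs/client.truststore.jks"            # matches the volume mount above
        - name: KEYSTORE_PATH
          value: "/certs/client.keystore.p12"
        - name: TRUSTSTORE_PASSWORD
          valueFrom:
            secretKeyRef:
              name: kafka-passwords                        # hypothetical Secret for passwords
              key: truststore-password
        - name: KEYSTORE_PASSWORD
          valueFrom:
            secretKeyRef:
              name: kafka-passwords
              key: keystore-password
        - name: KEY_PASSWORD
          valueFrom:
            secretKeyRef:
              name: kafka-passwords
              key: key-password
```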