Apache Kafka

Detailed documentation on the Apache Kafka pubsub component

Component format

To set up Apache Kafka pubsub, create a component of type pubsub.kafka. See this guide on how to create and apply a pubsub configuration.

```yaml
apiVersion: dapr.io/v1alpha1
kind: Component
metadata:
  name: kafka-pubsub
  namespace: default
spec:
  type: pubsub.kafka
  version: v1
  metadata:
    # Kafka broker connection setting
    - name: brokers
      value: "dapr-kafka.myapp.svc.cluster.local:9092"
    - name: authRequired
      value: "true"
    - name: saslUsername
      value: "adminuser"
    - name: saslPassword
      value: "KeFg23!"
    - name: maxMessageBytes
      value: 1024
```

Spec metadata fields

| Field | Required | Details | Example |
|-------|----------|---------|---------|
| brokers | Y | Comma-separated list of Kafka brokers | "localhost:9092", "dapr-kafka.myapp.svc.cluster.local:9092" |
| authRequired | N | Enable authentication on the Kafka broker. Defaults to "false" | "true", "false" |
| saslUsername | N | Username used for authentication. Only required if authRequired is set to "true" | "adminuser" |
| saslPassword | N | Password used for authentication. Can be a secretKeyRef to use a secret reference. Only required if authRequired is set to "true" | "", "KeFg23!" |
| maxMessageBytes | N | The maximum message size, in bytes, allowed for a single Kafka message. Defaults to 1024 | 2048 |
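
As the table notes, saslPassword can reference a secret rather than embedding the password in the component file. A minimal sketch of that variant, assuming a Kubernetes secret store and a hypothetical secret named kafka-secrets with a key saslPasswordValue:

```yaml
spec:
  type: pubsub.kafka
  version: v1
  metadata:
    - name: saslPassword
      secretKeyRef:
        name: kafka-secrets      # hypothetical Kubernetes secret name
        key: saslPasswordValue   # hypothetical key within that secret
auth:
  secretStore: kubernetes
```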

Per-call metadata fields

Partition Key

When invoking the Kafka pub/sub, it's possible to provide an optional partition key by using the metadata query parameter in the request URL.

The parameter name is partitionKey.


```shell
curl -X POST http://localhost:3500/v1.0/publish/myKafka/myTopic?metadata.partitionKey=key1 \
  -H "Content-Type: application/json" \
  -d '{
        "data": {
          "message": "Hi"
        }
      }'
```

Create a Kafka instance

You can run Kafka locally using this Docker image. To run without Docker, see the getting started guide here.

To run Kafka on Kubernetes, you can use any Kafka operator, such as Strimzi.