This module demonstrates the following:

- The usage of the Processor API, including `process()`, `addStateStore()` and `schedule()` (see the topology sketch after this list).
- The processor context and the scheduling of tasks based on stream time.
- The creation and cleanup of a key-value store.
- Unit testing using the Topology Test Driver.
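As an illustration, here is a minimal sketch of how such a topology could be wired together. The store name `PERSON_STORE`, the processor class `StoreCleanupProcessor` and the serde choice are assumptions for this sketch, not names taken from this module's source.

```java
import io.confluent.kafka.streams.serdes.avro.SpecificAvroSerde;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.Topology;
import org.apache.kafka.streams.state.Stores;

public class StoreCleanupTopologySketch {

    public static Topology topology() {
        StreamsBuilder builder = new StreamsBuilder();

        // Register the key-value store the processor reads from and cleans up.
        // The store name and value serde are assumptions for this sketch; a
        // SpecificAvroSerde must be configured with the Schema Registry URL in a
        // real application.
        builder.addStateStore(Stores.keyValueStoreBuilder(
                Stores.persistentKeyValueStore("PERSON_STORE"),
                Serdes.String(),
                new SpecificAvroSerde<KafkaPerson>()));

        // Route every record of the input topic through the custom processor and
        // grant it access to the store by name.
        builder.<String, KafkaPerson>stream("PERSON_TOPIC")
                .process(StoreCleanupProcessor::new, "PERSON_STORE");

        return builder.build();
    }
}
```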
In this module, records of type `<String, KafkaPerson>` are streamed from a topic named `PERSON_TOPIC`.
The following tasks are performed:

- Processes the stream using a custom processor that performs the following tasks (see the processor sketch after this list):
  - Pushes new events into the store.
  - Deletes events from the store when a tombstone is received.
  - Sends tombstones for each key of the store every minute, based on the stream time.
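A minimal sketch of what such a processor could look like is shown below, reusing the hypothetical `StoreCleanupProcessor` and `PERSON_STORE` names from the topology sketch above; the actual class in this module may differ.

```java
import java.time.Duration;

import org.apache.kafka.streams.KeyValue;
import org.apache.kafka.streams.processor.PunctuationType;
import org.apache.kafka.streams.processor.api.Processor;
import org.apache.kafka.streams.processor.api.ProcessorContext;
import org.apache.kafka.streams.processor.api.Record;
import org.apache.kafka.streams.state.KeyValueIterator;
import org.apache.kafka.streams.state.KeyValueStore;

public class StoreCleanupProcessor implements Processor<String, KafkaPerson, String, KafkaPerson> {

    private ProcessorContext<String, KafkaPerson> context;
    private KeyValueStore<String, KafkaPerson> store;

    @Override
    public void init(ProcessorContext<String, KafkaPerson> context) {
        this.context = context;
        this.store = context.getStateStore("PERSON_STORE");

        // Fire once per minute of stream time (driven by record timestamps,
        // not by the wall clock).
        context.schedule(Duration.ofMinutes(1), PunctuationType.STREAM_TIME, this::sendTombstones);
    }

    @Override
    public void process(Record<String, KafkaPerson> record) {
        if (record.value() == null) {
            // Tombstone: remove the key from the store.
            store.delete(record.key());
        } else {
            // New event: keep the latest value for the key.
            store.put(record.key(), record.value());
        }
    }

    private void sendTombstones(long timestamp) {
        try (KeyValueIterator<String, KafkaPerson> iterator = store.all()) {
            while (iterator.hasNext()) {
                KeyValue<String, KafkaPerson> entry = iterator.next();
                // Forward a tombstone for each key still present in the store; how
                // the tombstones propagate (e.g. back to the input topic) depends
                // on the surrounding topology.
                context.forward(new Record<>(entry.key, null, timestamp));
            }
        }
    }
}
```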
To compile and run this demo, you will need the following:
- Java 21
- Maven
- Docker
To run the application manually, please follow the steps below:

- Start a Confluent Platform in a Docker environment.
- Produce records of type `<String, KafkaPerson>` to a topic named `PERSON_TOPIC`. You can use the producer person to do this (a minimal producer sketch follows this list).
- Start the Kafka Streams application.
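If you prefer not to use the producer module, a hand-rolled producer could look like the sketch below. The broker and Schema Registry addresses and the `KafkaPerson` builder fields are assumptions for the example.

```java
import java.util.Properties;

import io.confluent.kafka.serializers.KafkaAvroSerializer;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class PersonProducerSketch {

    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // assumed address
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, KafkaAvroSerializer.class);
        props.put("schema.registry.url", "http://localhost:8081"); // assumed address

        try (KafkaProducer<String, KafkaPerson> producer = new KafkaProducer<>(props)) {
            // The builder fields below are assumptions; adapt them to the actual
            // KafkaPerson Avro schema.
            KafkaPerson person = KafkaPerson.newBuilder()
                    .setFirstName("Jane")
                    .setLastName("Doe")
                    .build();
            producer.send(new ProducerRecord<>("PERSON_TOPIC", "1", person));
        }
    }
}
```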
To run the application in Docker, please use the following command:

```console
docker-compose up -d
```
This command will start the following services in Docker:

- 1 Kafka broker (KRaft mode)
- 1 Schema Registry
- 1 Control Center
- 1 producer Person
- 1 Kafka Streams Store Cleanup