It took a bit longer than expected due to incorrect assumptions, but eventually we got it working fairly easily. Our solution consists of three parts:
The idea behind our solution is to use as many standard Mendix components as possible and to have a generic implementation that can be reused in any project. Therefore, we create a JSON message in Mendix using the standard export mappings, and we convert this JSON message to Avro (a GenericRecord) in our custom Java code. One thing to note here is that Avro expects explicit null values for empty elements for which no default value is specified in the schema; Mendix supports this out of the box with the option ‘send empty values’ set to ‘yes, as null’ in the export mapping.
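The JSON-to-GenericRecord step described above can be sketched with the standard Avro Java library. This is a minimal sketch, not the author's actual code; the class name `JsonToAvro` is hypothetical, while `Schema`, `GenericDatumReader`, and `DecoderFactory` come from `org.apache.avro`. One caveat worth knowing: Avro's `JsonDecoder` expects Avro's own JSON encoding, in which values of union fields such as `["null", "string"]` are wrapped as `{"string": "value"}`, so plain Mendix JSON may need a small preprocessing step to match it.

```java
import org.apache.avro.Schema;
import org.apache.avro.generic.GenericDatumReader;
import org.apache.avro.generic.GenericRecord;
import org.apache.avro.io.DecoderFactory;
import org.apache.avro.io.JsonDecoder;

public class JsonToAvro {

    // Converts a JSON string (e.g. produced by a Mendix export mapping)
    // into an Avro GenericRecord, validated against the given schema.
    public static GenericRecord convert(String schemaJson, String json) throws Exception {
        Schema schema = new Schema.Parser().parse(schemaJson);
        GenericDatumReader<GenericRecord> reader = new GenericDatumReader<>(schema);
        // JsonDecoder reads Avro's JSON encoding; explicit nulls in the input
        // satisfy fields without a default value, as noted above.
        JsonDecoder decoder = DecoderFactory.get().jsonDecoder(schema, json);
        return reader.read(null, decoder);
    }
}
```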
The custom Java code uses three main libraries:
We implemented both an approach that uses the schema registry and an approach that doesn’t. Without a schema registry, you have to manually supply a schema from the Mendix application to the custom Java code, and you have to fiddle with magic bytes and schema IDs yourself.
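The “magic bytes and schema IDs” refer to the Confluent wire format: each message starts with a single magic byte `0x00`, followed by a 4-byte big-endian schema ID, followed by the Avro binary payload. A minimal sketch of doing that framing by hand, assuming you manage the IDs yourself (the `WireFormat` class and its method names are hypothetical):

```java
import java.nio.ByteBuffer;

public class WireFormat {

    // Prepends the Confluent wire-format header (magic byte + schema ID)
    // to an already-serialized Avro binary payload.
    public static byte[] frame(int schemaId, byte[] avroPayload) {
        ByteBuffer buf = ByteBuffer.allocate(1 + 4 + avroPayload.length);
        buf.put((byte) 0x00);   // magic byte, always 0 in the Confluent format
        buf.putInt(schemaId);   // 4-byte big-endian schema ID
        buf.put(avroPayload);   // the Avro-encoded record itself
        return buf.array();
    }

    // Reads the schema ID back out of a framed message, e.g. on the consumer side.
    public static int schemaIdOf(byte[] message) {
        ByteBuffer buf = ByteBuffer.wrap(message);
        if (buf.get() != 0x00) {
            throw new IllegalArgumentException("Unknown magic byte");
        }
        return buf.getInt();
    }
}
```

With a schema registry, the Confluent serializers handle this framing for you; without one, every producer and consumer has to agree on the IDs out of band.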
We had to modify the Mendix-supplied Kafka module, but the changes were fairly straightforward. I duplicated a number of files and modified them to handle Avro messages:
The modifications in these files mainly concerned a different type: KafkaProducer&lt;String, String&gt; became KafkaProducer&lt;String, byte[]&gt;. In the Send* actions and the ConsumerRunner, I implemented the conversion between the JSON message generated in Mendix and the Avro message produced by our custom Java code. Once you have modified these files, the configuration options provided by the module are enough to make this work.
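Switching the value type to byte[] also means switching the value serializer in the producer configuration. A sketch of what that configuration change looks like with the plain kafka-clients serializer class names (the `AvroProducerConfig` helper itself is hypothetical, not part of the Mendix module):

```java
import java.util.Properties;

public class AvroProducerConfig {

    // Builds producer properties for a KafkaProducer<String, byte[]>:
    // keys stay Strings, values are the framed Avro byte arrays.
    public static Properties build(String bootstrapServers) {
        Properties props = new Properties();
        props.put("bootstrap.servers", bootstrapServers);
        props.put("key.serializer",
                "org.apache.kafka.common.serialization.StringSerializer");
        // ByteArraySerializer passes the Avro bytes through unchanged,
        // instead of the StringSerializer the original module used.
        props.put("value.serializer",
                "org.apache.kafka.common.serialization.ByteArraySerializer");
        return props;
    }
}
```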
Hope this helps with your implementation!
Hi @Kranz Bertram,
We are starting a PoC where we’re struggling with similar questions. Have you finished your development?
We’re using Confluent on our projects, so it would be interesting to know whether there is something standard to use nowadays.