pl.touk.nussknacker.engine.avro.schemaregistry.confluent.formatter
Step 1: Deserialize the raw Kafka event into the GenericRecord/SpecificRecord domain.
Step 2: Create encoders that use ConfluentAvroMessageFormatter to convert the Avro object to JSON.
Step 3: Encode the event's data, together with the schema ids, using the derived encoder.
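The encoding side (steps 2 and 3) can be sketched with a minimal, self-contained model. The names `FormattedTestRecord` and `formatRecord` are illustrative stand-ins, not the actual Nussknacker API; the real code uses circe encoders and ConfluentAvroMessageFormatter to render the Avro object as JSON.

```scala
// Illustrative model of an already-deserialized record (step 1 assumed done):
// the key/value are here pre-rendered JSON strings, plus the writer schema ids.
final case class FormattedTestRecord(
  keySchemaId: Option[Int], // id of the writer schema used for the key, if any
  valueSchemaId: Int,       // id of the writer schema used for the value
  key: Option[String],      // key already rendered as JSON
  value: String             // value already rendered as JSON
)

// Steps 2-3 (sketched): emit the record's data together with its schema ids.
def formatRecord(r: FormattedTestRecord): String = {
  val keyFields = (r.key, r.keySchemaId) match {
    case (Some(k), Some(id)) => s""""key":$k,"keySchemaId":$id,"""
    case _                   => ""
  }
  s"""{$keyFields"value":${r.value},"valueSchemaId":${r.valueSchemaId}}"""
}
```

Carrying the schema ids inside the serialized test data is what later lets the reader recover the exact writer schemas from the schema registry.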
Step 1: Deserialize the raw JSON bytes into the AvroSerializableConsumerRecord[Json, Json] domain, without interpreting the key and value content.
Step 2: Create key and value json-to-avro interpreters based on the schema ids provided in the JSON.
Step 3: Use the interpreters to create a raw Kafka ConsumerRecord.
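The reading side of steps 1 and 2 can be sketched as follows. `ParsedRecord` and `parseRecord` are hypothetical names, and the regex-based extraction is for illustration only; the real formatter uses circe decoders and then builds a raw Kafka ConsumerRecord from the result (step 3, omitted here).

```scala
// Result of pulling the value and its writer schema id out of the test-data JSON.
final case class ParsedRecord(valueSchemaId: Int, valueJson: String)

def parseRecord(json: String): ParsedRecord = {
  // Naive regex extraction, for illustration only; real code parses the JSON properly.
  val idPattern    = """"valueSchemaId"\s*:\s*(\d+)""".r
  val valuePattern = """"value"\s*:\s*(\{[^}]*\})""".r
  val id = idPattern.findFirstMatchIn(json)
    .map(_.group(1).toInt)
    .getOrElse(throw new IllegalArgumentException("missing valueSchemaId"))
  val value = valuePattern.findFirstMatchIn(json)
    .map(_.group(1))
    .getOrElse(throw new IllegalArgumentException("missing value"))
  ParsedRecord(id, value)
}
```

The extracted schema id is then used to look up the writer schema in the registry, so the value JSON can be interpreted as Avro exactly as it was originally written.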
The formatter uses writer schema ids to ensure that test data represents the raw event data, without schema evolution (which would adjust the data to the reader schema). A test data record contains the ConsumerRecord data together with the key and value schema ids (see [AvroSerializableConsumerRecord]).
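The shape of such a test data record can be sketched as below. Field names are assumed from the description above; consult the actual AvroSerializableConsumerRecord definition for the authoritative layout.

```scala
// Illustrative model of the ConsumerRecord data carried in test data.
final case class SerializableConsumerRecord[K, V](
  key: Option[K],
  value: V,
  topic: Option[String],
  partition: Option[Int],
  offset: Option[Long]
)

// The test data record: ConsumerRecord data plus the writer schema ids,
// so reading the data back does not trigger schema evolution.
final case class AvroSerializableConsumerRecord[K, V](
  keySchemaId: Option[Int], // writer schema id of the key, if the key is Avro
  valueSchemaId: Int,       // writer schema id of the value
  consumerRecord: SerializableConsumerRecord[K, V]
)
```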
- key type passed from KafkaAvroSourceFactory, used to determine which datumReaderWriter to use (e.g. specific or generic)
- value type passed from KafkaAvroSourceFactory, used to determine which datumReaderWriter to use (e.g. specific or generic)