Is the Avro SpecificRecord (i.e. the generated Java classes) compatible with schema evolution? That is, if I have a source of Avro messages (in my case, Kafka) and I want to deserialize those messages into a SpecificRecord, is it possible to do so safely?
What I see: even if the messages are compatible under Avro's schema-evolution rules, deserializing them directly into the generated class is a problem.
If I can find the new schema (using e.g. the Confluent Schema Registry) I can deserialize to a GenericRecord, but there doesn't seem to be a way to map from a GenericRecord to a SpecificRecord of a different schema:
MySpecificType message = (MySpecificType) SpecificData.get().deepCopy(MySpecificType.SCHEMA$, genericMessage);
deepCopy is mentioned in various places, but it copies fields by position (index), so it doesn't work when the two schemas differ.
Is there any safe way to map between two Avro objects when you have both schemas and they are compatible? Even mapping from GenericRecord to GenericRecord would do, as I could then use the deepCopy trick to complete the job.
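One idea I've considered is round-tripping through bytes so that Avro's schema resolution does the work, roughly like the sketch below (assuming writerSchema is the schema fetched from the registry and genericMessage is the GenericRecord decoded with it), but I'm not sure whether this is the right/safe approach:

import java.io.ByteArrayOutputStream;
import org.apache.avro.generic.GenericDatumWriter;
import org.apache.avro.generic.GenericRecord;
import org.apache.avro.io.BinaryDecoder;
import org.apache.avro.io.BinaryEncoder;
import org.apache.avro.io.DecoderFactory;
import org.apache.avro.io.EncoderFactory;
import org.apache.avro.specific.SpecificDatumReader;

// Re-encode the GenericRecord using the writer's schema (IOException handling omitted)
ByteArrayOutputStream out = new ByteArrayOutputStream();
BinaryEncoder encoder = EncoderFactory.get().binaryEncoder(out, null);
new GenericDatumWriter<GenericRecord>(writerSchema).write(genericMessage, encoder);
encoder.flush();

// ...then decode with a SpecificDatumReader that resolves writerSchema against the generated class's schema
SpecificDatumReader<MySpecificType> reader =
    new SpecificDatumReader<MySpecificType>(writerSchema, MySpecificType.SCHEMA$);
BinaryDecoder decoder = DecoderFactory.get().binaryDecoder(out.toByteArray(), null);
MySpecificType message = reader.read(null, decoder);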
There are example tests here for specific data type conversion. It's all in the configuration 'specificDeserializerProps'.
I added the following config and got the specific type out as desired.
// Assumes schemaRegistry is a SchemaRegistryClient (e.g. a MockSchemaRegistryClient in tests)
HashMap<String, String> specificDeserializerProps = new HashMap<String, String>();
// A URL must be present in the config, but it is not used when a client instance is passed in directly
specificDeserializerProps.put(KafkaAvroDeserializerConfig.SCHEMA_REGISTRY_URL_CONFIG, "bogus");
// Tell the deserializer to produce the generated SpecificRecord classes instead of GenericRecord
specificDeserializerProps.put(KafkaAvroDeserializerConfig.SPECIFIC_AVRO_READER_CONFIG, "true");
KafkaAvroDeserializer specificAvroDeserializer = new KafkaAvroDeserializer(schemaRegistry, specificDeserializerProps);
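With specific.avro.reader set to true, the deserializer fetches the writer's schema from the registry and resolves it against the schema of the generated class, so consuming looks roughly like this (a sketch; the topic name and messageBytes are placeholders):

// deserialize() returns Object; with SPECIFIC_AVRO_READER_CONFIG=true it is the generated SpecificRecord class
MySpecificType message = (MySpecificType) specificAvroDeserializer.deserialize("my-topic", messageBytes);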
Hope that helps