Hi, I'm trying to set auto.register.schemas to false for my Kafka producer, but I'm having trouble figuring out where that setting is supposed to go. If I set it in my producer config, I get this warning in the logs:
WARN org.apache.kafka.clients.producer.ProducerConfig - The configuration 'auto.register.schemas' was supplied but isn't a known config.
And I verified that schemas are being auto-registered in this case.
It’s the correct place; the warning is just annoying/misleading.
This is my config:
(def producer-config
  {ProducerConfig/BOOTSTRAP_SERVERS_CONFIG "localhost:9092"
   ProducerConfig/ENABLE_IDEMPOTENCE_CONFIG true
   ProducerConfig/CLIENT_ID_CONFIG "commander"
   AbstractKafkaAvroSerDeConfig/AUTO_REGISTER_SCHEMAS false})
Which is passed to jackdaw.client/producer.
But I checked that schemas are in fact being auto-registered: I made a new topic with no schema set, produced to it with this producer, and it registered the schema, which I don't want.
So it seems this config value is being ignored.
You might also need to set AbstractKafkaSchemaSerDeConfig/USE_LATEST_VERSION to true. Which is a bit weird, but that's how they implemented it.
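A minimal sketch of the combined config, assuming AbstractKafkaSchemaSerDeConfig resolves on your classpath (it's the Confluent 5.5+ name; older versions only have AbstractKafkaAvroSerDeConfig):

```clojure
(ns example.producer-config
  (:import [org.apache.kafka.clients.producer ProducerConfig]
           [io.confluent.kafka.serializers AbstractKafkaSchemaSerDeConfig]))

;; Sketch: disable auto-registration and have the serializer use the
;; latest schema already registered for the subject instead.
(def producer-config
  {ProducerConfig/BOOTSTRAP_SERVERS_CONFIG "localhost:9092"
   ProducerConfig/ENABLE_IDEMPOTENCE_CONFIG true
   ProducerConfig/CLIENT_ID_CONFIG "commander"
   AbstractKafkaSchemaSerDeConfig/AUTO_REGISTER_SCHEMAS false
   AbstractKafkaSchemaSerDeConfig/USE_LATEST_VERSION true})
```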
Ohh right, okay
It doesn’t make much sense to me, because of course I want to use the latest available schema if I’m not registering one; which other one would be used? Spent a lot of time before finding that one out..
Is jackdaw by default using the Avro serializers? At least with the base Java client you also need ProducerConfig/KEY_SERIALIZER_CLASS_CONFIG, ProducerConfig/VALUE_SERIALIZER_CLASS_CONFIG, and AbstractKafkaSchemaSerDeConfig/SCHEMA_REGISTRY_URL_CONFIG.
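For reference, a sketch of what a config for the base Java client would look like without jackdaw's serde resolution (the localhost URLs are placeholders, and this again assumes the Confluent 5.5+ class names):

```clojure
(ns example.plain-client-config
  (:import [org.apache.kafka.clients.producer ProducerConfig]
           [io.confluent.kafka.serializers AbstractKafkaSchemaSerDeConfig
                                           KafkaAvroSerializer]
           [org.apache.kafka.common.serialization StringSerializer]))

;; Sketch: with the plain Java client you wire up the serializer classes
;; and schema registry URL yourself, alongside auto.register.schemas.
(def plain-client-config
  {ProducerConfig/BOOTSTRAP_SERVERS_CONFIG "localhost:9092"
   ProducerConfig/KEY_SERIALIZER_CLASS_CONFIG StringSerializer
   ProducerConfig/VALUE_SERIALIZER_CLASS_CONFIG KafkaAvroSerializer
   AbstractKafkaSchemaSerDeConfig/SCHEMA_REGISTRY_URL_CONFIG "http://localhost:8081"
   AbstractKafkaSchemaSerDeConfig/AUTO_REGISTER_SCHEMAS false})
```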
Yeah, it seems like this should be handled automatically when auto register is false.
I'm not sure, but I use serde-resolver like this:
(def resolve-serde (serde-resolver :schema-registry-url "http://localhost:8081"))
(defstate test-producer
  "For testing sending data using an avro serde."
  :start (jc/producer producer-config
                      {:key-serde (string-serde)
                       :value-serde (resolve-serde
                                     {:serde-keyword :jackdaw.serdes.avro.confluent/serde
                                      :key? false
                                      :schema-filename "test-value-v1.json"})})
  :stop (.close test-producer))
And that seems to work.
So, I see that static field you mentioned, AbstractKafkaSchemaSerDeConfig/SCHEMA_REGISTRY_URL_CONFIG, but when I try to import it, it says it does not exist o_O
(:import [org.apache.kafka.clients.producer ProducerConfig]
         [io.confluent.kafka.serializers AbstractKafkaAvroSerDeConfig
                                         AbstractKafkaSchemaSerDeConfig])
java.lang.ClassNotFoundException: io.confluent.kafka.serializers.AbstractKafkaSchemaSerDeConfig
I'm on confluent platform 5.5.1 :thinking_face:
I can just use the string, but that's odd
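Using the strings directly is a workable fallback: the static fields are just constants for the raw config keys, so a string-keyed map sidesteps the class-resolution problem entirely. A sketch, with "auto.register.schemas" and "use.latest.version" being the actual Confluent config names behind those fields:

```clojure
;; Sketch: raw config keys instead of static fields, so this works no
;; matter which Confluent serializer version is on the classpath.
(def producer-config
  {"bootstrap.servers"      "localhost:9092"
   "enable.idempotence"     true
   "client.id"              "commander"
   "auto.register.schemas"  false
   "use.latest.version"     true})
```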
The example is using Confluent Platform 5.5.1, so maybe check whether the resolved dependency is really 5.5.1.. And 6.0.0 could be released any day now, or whenever it’s done.