jackdaw

https://github.com/FundingCircle/jackdaw
dakra 2020-09-24T07:13:52.000200Z

ok. thanks for the clarification @noisesmith. Hope they pick up development of jackdaw soon. Quite a few PRs seem ready and just waiting for the merge button to be pressed 😉

bringe 2020-09-24T20:05:52.002600Z

Hi, I'm trying to set auto.register.schemas to false for my kafka producer, but I'm having trouble figuring out where that setting is supposed to go. If I set it in my producer config, I get this warning in the logs:

WARN org.apache.kafka.clients.producer.ProducerConfig - The configuration 'auto.register.schemas' was supplied but isn't a known config.

bringe 2020-09-24T20:06:10.003Z

And I verified that schemas are being auto registered in this case

gklijs 2020-09-24T20:07:17.003500Z

It’s the correct place; the warning is just annoying/misleading.

bringe 2020-09-24T20:07:33.003800Z

This is my config

(def producer-config
  {ProducerConfig/BOOTSTRAP_SERVERS_CONFIG "localhost:9092"
   ProducerConfig/ENABLE_IDEMPOTENCE_CONFIG true
   ProducerConfig/CLIENT_ID_CONFIG "commander"
   AbstractKafkaAvroSerDeConfig/AUTO_REGISTER_SCHEMAS false})
Which is passed to jackdaw.client/producer

bringe 2020-09-24T20:08:38.003900Z

But I checked that the schemas are in fact being auto-registered. I made a new topic with no schema set and produced to it with my producer, and it registered the schema, which I don't want to happen.

bringe 2020-09-24T20:08:51.004100Z

So it seems this config value is ignored

gklijs 2020-09-24T20:09:15.004300Z

You might also need to set AbstractKafkaSchemaSerDeConfig.USE_LATEST_VERSION to true. Which is a bit weird, but how they implemented it.
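A minimal sketch of the two serde-level settings together (assuming a Confluent 5.5+ classpath so both constants resolve; as comes up later in the thread, the `AbstractKafkaSchemaSerDeConfig` constant may not resolve, in which case the literal string `"use.latest.version"` works):

```clojure
;; Serde-level config: these keys configure the Avro (de)serializer,
;; not the producer itself, which is why ProducerConfig logs the
;; "isn't a known config" warning when they appear in the producer map.
(def serde-config
  {AbstractKafkaAvroSerDeConfig/AUTO_REGISTER_SCHEMAS false
   AbstractKafkaSchemaSerDeConfig/USE_LATEST_VERSION  true})
```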

bringe 2020-09-24T20:09:30.004500Z

Ohh right, okay

gklijs 2020-09-24T20:11:36.004700Z

It doesn’t make much sense to me, because of course I want to use the latest available schema if I’m not registering one; which other one would be used? I spent a lot of time before figuring that one out..

gklijs 2020-09-24T20:15:30.004900Z

Is jackdaw using the Avro serializers by default? At least with the base Java client you also need ProducerConfig/KEY_SERIALIZER_CLASS_CONFIG, ProducerConfig/VALUE_SERIALIZER_CLASS_CONFIG, and AbstractKafkaSchemaSerDeConfig/SCHEMA_REGISTRY_URL_CONFIG
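For reference, a sketch of what that looks like with the base Java client, using the literal string keys (the localhost addresses are placeholders):

```clojure
;; Plain Java-client style config: point both serializers at the Confluent
;; Avro serializer and give them the registry URL. Not needed when jackdaw
;; serdes are passed explicitly to jc/producer, as in the snippet below.
{"bootstrap.servers"   "localhost:9092"
 "key.serializer"      "io.confluent.kafka.serializers.KafkaAvroSerializer"
 "value.serializer"    "io.confluent.kafka.serializers.KafkaAvroSerializer"
 "schema.registry.url" "http://localhost:8081"}
```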

bringe 2020-09-24T20:27:39.005200Z

Yeah, seems like this should be handled if auto register is false

bringe 2020-09-24T20:28:31.005400Z

I'm not sure, but I use serde-resolver like this

(def resolve-serde (serde-resolver :schema-registry-url "http://localhost:8081"))

(defstate test-producer
  "For testing sending data using an avro serde."
  :start (jc/producer producer-config
                      {:key-serde (string-serde)
                       :value-serde (resolve-serde
                                     {:serde-keyword :jackdaw.serdes.avro.confluent/serde
                                      :key? false
                                      :schema-filename "test-value-v1.json"})})
  :stop (.close test-producer))

bringe 2020-09-24T20:28:37.005600Z

And that seems to work

bringe 2020-09-24T20:36:53.005800Z

So, I see that static field you mentioned, AbstractKafkaSchemaSerDeConfig/SCHEMA_REGISTRY_URL_CONFIG, but when I try to import it, it says it does not exist o_O

bringe 2020-09-24T20:37:02.006Z

(:import [org.apache.kafka.clients.producer ProducerConfig]
         [io.confluent.kafka.serializers AbstractKafkaAvroSerDeConfig AbstractKafkaSchemaSerDeConfig])

bringe 2020-09-24T20:37:10.006200Z

java.lang.ClassNotFoundException: io.confluent.kafka.serializers.AbstractKafkaSchemaSerDeConfig

bringe 2020-09-24T20:37:49.006400Z

I'm on confluent platform 5.5.1 :thinking_face:

bringe 2020-09-24T20:38:03.006600Z

I can just use the string, but that's odd
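The static fields are just string constants, so the literal keys can stand in for the missing import (a sketch; the key-to-constant mapping below is from the Confluent config classes):

```clojure
;; AUTO_REGISTER_SCHEMAS      => "auto.register.schemas"
;; USE_LATEST_VERSION         => "use.latest.version"
;; SCHEMA_REGISTRY_URL_CONFIG => "schema.registry.url"
(def serde-props
  {"auto.register.schemas" false
   "use.latest.version"    true
   "schema.registry.url"   "http://localhost:8081"})
```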

gklijs 2020-09-24T20:46:04.006800Z

My example is using Confluent Platform 5.5.1, so maybe you need to check whether the resolved dependency is really 5.5.1.. And 6.0.0 could be released any day now, or whenever it’s done.

1👍