
AWS MSK S3 Sink Connector to Deserialize AVRO without Schema Registry #708

GauravBhandariTSB opened this issue Dec 27, 2023 · 2 comments


@GauravBhandariTSB

Referring to

The Kafka topic contains Avro Flume events. I understand that each event in the topic carries both the schema and the payload. I have provided the Kafka Connect sink properties below:

"connector_configuration": {
              "connector.class": "io.confluent.connect.s3.S3SinkConnector",
              "flush.size": "15",
              "format.class": "io.confluent.connect.s3.format.avro.AvroFormat",
              "key.converter": "org.apache.kafka.connect.storage.StringConverter",
              "locale": "en-US",
              "partition.duration.ms": "86400000",
              "partitioner.class": "io.confluent.connect.storage.partitioner.DailyPartitioner",
              "path.format": "'year'=YYYY/'month'=MM/'day'=dd",
              "rotate.interval.ms": "900000",
              "s3.bucket.name": "s3-destination",
              "s3.region": "eu-west-2",
              "schema.compatibility": "NONE",
              "storage.class": "io.confluent.connect.s3.storage.S3Storage",
              "tasks.max": "1",
              "timestamp.extractor": "Record",
              "timezone": "UTC",
              "topics": "XXXXX",
              "topics.dir": "topic",
            },

The output Avro file does get created, but the application is not able to parse it and throws an error. I checked the output Avro file in the S3 bucket and tried to parse it myself, and I get an error as well.

[image: output Avro file header, with avro.schema showing only null/bytes]

As seen in the image, avro.schema shows only null and bytes. I am using the Confluent Kafka S3 Sink Connector; I created a custom plugin from it and then referenced that plugin in AWS MSK Connect.
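For reference, the embedded writer schema can be checked directly from a downloaded copy of one of the output files; a minimal sketch, assuming the fastavro Python package and a placeholder file name:

    from fastavro import reader

    # Open a local copy of one of the files the S3 sink wrote (placeholder name).
    with open("part-0000.avro", "rb") as fo:
        avro_reader = reader(fo)
        # The writer schema is whatever the connector embedded in the file header;
        # in the broken output described above it shows only null/bytes instead of
        # the expected Flume event record schema.
        print(avro_reader.writer_schema)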

Can someone please help and let me know what exactly I am doing wrong?

@OneCricketeer

AvroFormat only requires a Connect Struct type with some Schema...

You need to define your own value.converter class, not use the default you've got set now. The Registry isn't required to store Avro data, e.g. I've tested storing Avro from JsonConverter with schemas enabled.
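For illustration, a minimal sketch of how that could look in the connector configuration above, assuming the topic data were JSON with embedded schemas as in the JsonConverter example mentioned here (the right converter ultimately depends on how the records were actually produced):

    "value.converter": "org.apache.kafka.connect.json.JsonConverter",
    "value.converter.schemas.enable": "true"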

@raphaelauv

Check avro ser/deser projects; this is one -> https://github.com/farmdawgnation/registryless-avro-converter

I don't know if it works.
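For what it's worth, wiring that converter into the configuration above would look roughly like this; a sketch based on that project's README, so the class name, property name, and schema path should be treated as assumptions to verify against the project itself:

    "value.converter": "me.frmr.kafka.connect.RegistrylessAvroConverter",
    "value.converter.schema.path": "/path/to/flume-event.avsc"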
