Greetings from AWS!
It seems you're using the AWS S3 Sink Connector with a byte-array deserializer to consume messages from Kafka, but the messages aren't directly convertible to JSON because they contain Unicode escape sequences. Your goal is to print these messages in a human-readable key-value JSON format.
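As a quick illustration of that situation (assuming the byte-array payload is UTF-8 JSON text containing `\uXXXX` escapes, which may differ from your actual format), a standard JSON parser will already resolve those escapes into readable characters:

```python
import json

# Hypothetical raw payload as a byte array, containing a Unicode
# escape sequence (\u00fc) instead of the literal character.
raw = b'{"user": "alice", "city": "M\\u00fcnchen"}'

# Decode the bytes to text, then parse as JSON; the JSON parser
# converts \u00fc into the character it represents.
record = json.loads(raw.decode("utf-8"))
print(record)  # {'user': 'alice', 'city': 'München'}
```

If your payload does not parse this way, the bytes are likely in a binary serialization (e.g., Avro) rather than escaped JSON text, which is why the details requested below matter.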
To provide the most effective solution, I'd appreciate some additional details:
- Please share a sample message from the Kafka topic, obtained using the kafka-console-consumer.sh tool. Refer: https://www.conduktor.io/kafka/kafka-consumer-cli-tutorial/
- What source or client library is producing messages into the Kafka topic? Can you clarify the original schema of the messages (e.g., Avro, JSON, custom format)?
- If there's a schema used when inserting messages into the topic, please describe its structure in plain text.
- Is there any transformation or conversion applied to the message before it's inserted into Kafka (e.g., using schema converters)? We'd like to understand the message's original format, its insertion process, and the desired S3 format.
- Could you share the connector configuration and worker configuration used on your end that resulted in the above message format?
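For reference, the converter and format settings in the connector configuration are usually what determine how records land in S3. A minimal sketch of a Confluent S3 sink configuration that writes records out as plain JSON (the `name`, `topics`, and `s3.bucket.name` values below are placeholders, not your actual settings) might look like:

```json
{
  "name": "s3-sink",
  "config": {
    "connector.class": "io.confluent.connect.s3.S3SinkConnector",
    "topics": "my-topic",
    "s3.bucket.name": "my-bucket",
    "format.class": "io.confluent.connect.s3.format.json.JsonFormat",
    "value.converter": "org.apache.kafka.connect.json.JsonConverter",
    "value.converter.schemas.enable": "false"
  }
}
```

If your configuration instead uses `ByteArrayConverter` with a byte-oriented format class, the sink writes the raw bytes as-is, which would explain the escaped output you are seeing.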
NOTE: When sharing this information, please ensure that no sensitive data such as passwords, account IDs, or credentials is disclosed.
For more comprehensive assistance, consider contacting AWS Premium Support directly through the AWS Support Center. This option is available if you have an active support plan or would like to purchase one.
[+] Confluent Amazon S3 Sink Connector - https://docs.confluent.io/kafka-connectors/s3-sink/current/overview.html
[+] Lenses AWS S3 Sink Connector - https://docs.lenses.io/5.0/integrations/connectors/stream-reactor/sinks/s3sinkconnector/
