InvalidInput.InvalidConnectorConfiguration Error on AWS MSK Sink Connector

What I'm trying to do: I run a Python producer on an EC2 instance to send data to MSK, and I want the sink connector to save that data from MSK to S3 in Parquet format.
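For context, a minimal sketch of that kind of producer is below (it assumes kafka-python; the broker endpoint, topic name, and payload are placeholders, and the real producer serializes values as Avro through the AWS Glue Schema Registry so that AWSKafkaAvroConverter can decode them):

    # Minimal producer sketch (kafka-python); broker endpoint, topic and payload
    # are placeholders, not the real setup.
    import json
    from kafka import KafkaProducer

    producer = KafkaProducer(
        bootstrap_servers=["b-1.example.kafka.ap-northeast-2.amazonaws.com:9094"],  # placeholder MSK bootstrap broker
        security_protocol="SSL",
        # In the real producer this serializer is replaced by the AWS Glue Schema
        # Registry Avro serializer, so the sink's AWSKafkaAvroConverter can read it.
        value_serializer=lambda v: json.dumps(v).encode("utf-8"),
    )

    producer.send("topicname", {"example_field": "example_value"})
    producer.flush()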

Problem: When creating the MSK Connect sink connector in the AWS console, creation fails on the value.converter=com.amazonaws.services.schemaregistry.kafkaconnect.AWSKafkaAvroConverter setting.

ErrorCode: InvalidInput.InvalidConnectorConfiguration
Message: The connector configuration is invalid. Message: Connector configuration is invalid and contains the following 1 error(s): Invalid value com.amazonaws.services.schemaregistry.kafkaconnect.AWSKafkaAvroConverter for configuration value.converter: Class com.amazonaws.services.schemaregistry.kafkaconnect.AWSKafkaAvroConverter could not be found.

Connector configuration:

connector.class=io.confluent.connect.s3.S3SinkConnector
tasks.max=1
topics=topicname
s3.region=ap-northeast-2
s3.bucket.name=***
flush.size=1
storage.class=io.confluent.connect.s3.storage.S3Storage
format.class=io.confluent.connect.s3.format.parquet.ParquetFormat
partitioner.class=io.confluent.connect.storage.partitioner.DefaultPartitioner
key.converter=org.apache.kafka.connect.storage.StringConverter
key.converter.region=ap-northeast-2
key.converter.registry.name=registryname
key.converter.schemaName=schemaname
key.converter.schemaAutoRegistrationEnabled=true
key.converter.avroRecordType=GENERIC_RECORD
value.converter=com.amazonaws.services.schemaregistry.kafkaconnect.AWSKafkaAvroConverter
value.converter.region=ap-northeast-2
value.converter.registry.name=registryname
value.converter.schemaName=schemaname
value.converter.schemaAutoRegistrationEnabled=true
value.converter.avroRecordType=GENERIC_RECORD
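For reference, the same configuration expressed as a Python dict, for example as the connectorConfiguration argument of boto3's kafkaconnect create_connector call (only the configuration map is shown; the other required create_connector arguments such as the cluster, plugin, and IAM role are omitted):

    import boto3

    # Same connector configuration as above, as a plain string-to-string map.
    connector_configuration = {
        "connector.class": "io.confluent.connect.s3.S3SinkConnector",
        "tasks.max": "1",
        "topics": "topicname",
        "s3.region": "ap-northeast-2",
        "s3.bucket.name": "***",
        "flush.size": "1",
        "storage.class": "io.confluent.connect.s3.storage.S3Storage",
        "format.class": "io.confluent.connect.s3.format.parquet.ParquetFormat",
        "partitioner.class": "io.confluent.connect.storage.partitioner.DefaultPartitioner",
        "key.converter": "org.apache.kafka.connect.storage.StringConverter",
        "key.converter.region": "ap-northeast-2",
        "key.converter.registry.name": "registryname",
        "key.converter.schemaName": "schemaname",
        "key.converter.schemaAutoRegistrationEnabled": "true",
        "key.converter.avroRecordType": "GENERIC_RECORD",
        "value.converter": "com.amazonaws.services.schemaregistry.kafkaconnect.AWSKafkaAvroConverter",
        "value.converter.region": "ap-northeast-2",
        "value.converter.registry.name": "registryname",
        "value.converter.schemaName": "schemaname",
        "value.converter.schemaAutoRegistrationEnabled": "true",
        "value.converter.avroRecordType": "GENERIC_RECORD",
    }

    client = boto3.client("kafkaconnect", region_name="ap-northeast-2")
    # client.create_connector(connectorConfiguration=connector_configuration, ...)
    #   -- the remaining required arguments (connectorName, capacity, kafkaCluster,
    #      plugins, serviceExecutionRoleArn, etc.) are omitted here.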

AWS MSK cluster version: 3.5.1

I suspect the cluster version doesn't support this converter. What is the solution? The same problem occurred with the highest available cluster version.

kwon
Asked 5 months ago · 339 views
No answers
