I assume it is a given that the producers need to send their payload in multiple messages; otherwise, combine the parts and send them as a single message. If the reason for not consolidating is payload size, you could save the payload in S3/DynamoDB/etc. and only send a pointer to the data in the queue (the "claim check" pattern).
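A minimal sketch of that pointer-in-the-queue idea, with a plain dict standing in for S3/DynamoDB and a list standing in for the queue (the names `send_pointer` and `receive` are illustrative, not a real SDK):

```python
import json
import uuid

store = {}  # stands in for S3/DynamoDB

def send_pointer(queue, payload: bytes) -> str:
    """Persist the payload out of band and enqueue only a small pointer message."""
    key = str(uuid.uuid4())
    store[key] = payload                             # "upload" to the object store
    queue.append(json.dumps({"payload_key": key}))   # pointer message stays tiny
    return key

def receive(queue) -> bytes:
    """Dequeue a pointer message and fetch the actual payload from the store."""
    msg = json.loads(queue.pop(0))
    return store[msg["payload_key"]]

queue = []
send_pointer(queue, b"a very large payload")
assert receive(queue) == b"a very large payload"
```

The queue message size then stays constant regardless of payload size, which matters for services like SQS that cap messages at 256 KB.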
If you still need to send multiple messages, you can use Kafka (or its managed version, MSK). Create a single topic with multiple partitions and use the producer ID as the message key; Kafka's key-based partitioning will route all messages from the same producer to the same partition within the topic. Then create a single consumer group subscribed to that topic. Within the consumer group, each partition is handled by exactly one consumer. Note that a consumer may still handle multiple partitions, i.e., multiple producers, so it will need to buffer the relevant data in memory until it has received all messages for a given producer.
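The consumer-side bookkeeping described above can be sketched as follows. The message shape `(producer_id, seq, total, data)` is an assumption for illustration, not a Kafka API; within a partition, Kafka guarantees ordering per key, so chunks from one producer arrive in order:

```python
from collections import defaultdict

def consume(messages):
    """Buffer chunks per producer ID until the full payload has arrived.

    `messages` is an iterable of (producer_id, seq, total, data) tuples,
    standing in for records polled from the partitions this consumer owns.
    """
    buffers = defaultdict(list)   # producer_id -> chunks received so far
    completed = {}
    for producer_id, seq, total, data in messages:
        buffers[producer_id].append(data)
        if len(buffers[producer_id]) == total:        # final chunk seen
            completed[producer_id] = b"".join(buffers.pop(producer_id))
    return completed

msgs = [
    ("p1", 0, 2, b"hello "),
    ("p2", 0, 1, b"solo"),
    ("p1", 1, 2, b"world"),
]
assert consume(msgs) == {"p1": b"hello world", "p2": b"solo"}
```

Because a consumer can own several partitions, the buffer is keyed by producer ID; memory use is bounded by the number of in-flight multi-message payloads, not by the total number of producers.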
How many producers? How many consumers? How many messages?
100+ producers (this may increase going forward), each producing at a data rate of up to 20 KB/sec. The consumers are what I am trying to optimize; there is no limit on the number of consumers.