Question
I am building a Kafka Consumer application that consumes messages from a Kafka Topic and performs a database update task. The messages are produced in a large batch once every day - so the Topic has about 1 million messages loaded in 10 minutes. The Topic has 8 partitions.
The Spring Kafka Consumer (annotated with @KafkaListener and using a ConcurrentKafkaListenerContainerFactory) is triggered in very short batches.
The batch size is sometimes just 1 or 2 messages. It would help performance if it could consume about 1000 messages at once and process them together (for example, I could update the database in a single update SQL), instead of connecting to the database for each message.
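For context on the "single update SQL" idea: once the listener does receive a real batch, the per-message statements can be collapsed into one multi-row statement. A minimal, self-contained sketch of building such a statement (the table and column names are hypothetical, not from the original setup):

```java
import java.util.StringJoiner;

public class BatchSqlBuilder {
    // Build one parameterized multi-row INSERT for `rowCount` rows,
    // so a batch of consumed messages needs a single database round trip
    // instead of one statement per message.
    public static String buildInsert(String table, String column, int rowCount) {
        StringJoiner values = new StringJoiner(",");
        for (int i = 0; i < rowCount; i++) {
            values.add("(?)");
        }
        return "INSERT INTO " + table + " (" + column + ") VALUES " + values;
    }

    public static void main(String[] args) {
        // A batch of 3 consumed records -> one statement with 3 placeholders
        System.out.println(buildInsert("events", "payload", 3));
        // prints: INSERT INTO events (payload) VALUES (?),(?),(?)
    }
}
```

The placeholder values would then be bound from the batch of records via a JDBC `PreparedStatement` (or the equivalent in whatever data-access layer is in use).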
I have already tried to decrease the concurrency in the factory to avoid multiple threads consuming smaller number of messages.
I also increased the socket.send.buffer.bytes property in Kafka's server.properties to 1024000, from 102400.
These steps have not increased the batch size.
Is there any other configuration I could use to increase the batch size of the consumer?
Answer
See the Kafka consumer properties max.poll.records, fetch.min.bytes, fetch.max.wait.ms, fetch.max.bytes, max.partition.fetch.bytes.
Most likely fetch.min.bytes and fetch.max.wait.ms are what you need.
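To expand on that: fetch.min.bytes tells the broker to hold a fetch response until at least that many bytes are available (or until fetch.max.wait.ms elapses), and max.poll.records caps how many records a single poll() returns. A minimal consumer configuration sketch (the values are illustrative, not tuned for this workload):

```properties
# Broker waits until ~1 MB of data is available, or 500 ms, before answering a fetch
fetch.min.bytes=1048576
fetch.max.wait.ms=500
# Allow up to 1000 records per poll()
max.poll.records=1000
```

Note that in Spring Kafka, receiving the whole batch in one listener invocation also requires enabling batch mode on the ConcurrentKafkaListenerContainerFactory (setBatchListener(true)) and declaring the @KafkaListener method parameter as a List; otherwise records are still delivered to the listener one at a time even when poll() fetches many.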