I want to process messages consumed from topic1 and then send the processed messages back to Kafka on topic2:

Kafka --> Consumer (processes messages from topic1), then a Producer sends each processed message to topic2 --> Kafka

My attempt:
consumer.on('message', (message) => {
  let processedMsg = processMessage(message);
  payloads = [
    { topic: 'topic2', messages: processedMsg }
  ];
  producer.on('ready', function () {
    producer.send(payloads, function (err, data) {
      console.log(data);
    });
  });
  producer.on('error', function (err) {})
});
However, the producer fails to send the processed messages to Kafka. The error I get is:
MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 ready listeners added. Use emitter.setMaxListeners() to increase limit
I am using the Node module kafka-node.
Answer (score: 1):
You need to swap the order of the producer's 'ready' listener and the consumer's 'message' listener. Otherwise you register a new 'ready' listener for every consumed message.

For example:
producer.on('ready', function () {
  consumer.on('message', (message) => {
    let processedMsg = processMessage(message);
    payloads = [
      { topic: 'topic2', messages: processedMsg }
    ];
    producer.send(payloads, function (err, data) {
      console.log(data);
    });
  });
});
That said, if your use case is mainly processing messages and forwarding them to a new topic, I would suggest taking a look at this library: https://github.com/nodefluent/kafka-streams/