How to read from different topics using different KafkaListeners

Posted: 2019-04-23 09:58:39

Tags: spring-boot spring-kafka

I am trying to read from 2 different Kafka topics with 2 different KafkaListeners, and I get an error whenever the second KafkaListener is invoked. I receive an org.springframework.messaging.converter.MessageConversionException. It appears that the second listener is reading a message that belongs to the first listener and its topic.

I was listening to one Kafka topic and writing the message content to a WebSocket channel, to which a SockJS client is subscribed. That worked perfectly. I then created a new topic and added a second KafkaListener. However, when the second listener is invoked, I see that it tries to process/read a payload corresponding to the first KafkaListener and topic, and since it is not configured to do so, it throws a MessageConversionException.

Models
------
@JsonPropertyOrder({ "ackdate", "ack_count" })
public class DailyTransfer {
    private String ackdate;
    private Long ack_count;

    public DailyTransfer() {}

    public DailyTransfer(String ackdate, Long ack_count) {
        this.ackdate = ackdate;
        this.ack_count = ack_count;
    }

    ... Getters and Setters omitted for brevity

    @Override
    public String toString() {
        return "DailyTransfer{" +
                "ackdate='" + ackdate + '\'' +
                ", ack_count=" + ack_count +
                '}';
    }
}

@JsonPropertyOrder({ "rgdno", "bizname", "tin", "incordate", "commencedate", "biz_pk", "ack_at", "ack_at_ms", "ack_message" })
public class BizAck {

    private String rgdno;
    private String ack_message;
    private String bizname;
    private String tin;
    private String incordate;
    private String commencedate;
    private Long biz_pk;
    private String ack_at;
    private Long ack_at_ms;

    public BizAck() {}

    public BizAck(String rgdno, String ack_message, String bizname, String tin, String incordate, String commencedate, Long biz_pk, String ack_at,
                    Long ack_at_ms) {
        this.rgdno = rgdno;
        this.ack_message = ack_message;
        this.bizname = bizname;
        this.tin = tin;
        this.incordate = incordate;
        this.commencedate = commencedate;
        this.biz_pk = biz_pk;
        this.ack_at = ack_at;
        this.ack_at_ms = ack_at_ms;
    }

    ... Getters and Setters omitted for brevity

    @Override
    public String toString() {
        return "BizAck{" +
                "rgdno='" + rgdno + '\'' +
                ", ack_message='" + ack_message + '\'' +
                ", bizname='" + bizname + '\'' +
                ", tin='" + tin + '\'' +
                ", incordate='" + incordate + '\'' +
                ", commencedate='" + commencedate + '\'' +
                ", biz_pk=" + biz_pk +
                ", ack_at='" + ack_at + '\'' +
                ", ack_at_ms=" + ack_at_ms +
                '}';
    }
}

Configuration
-------------
@Bean
public Map<String, Object> consumerConfigs() {
    Map<String, Object> cprops = new HashMap<>();
    cprops.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, env.getProperty("spring.kafka.bootstrap-servers"));
    cprops.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
    cprops.put(ConsumerConfig.GROUP_ID_CONFIG, env.getProperty("spring.kafka.consumer.group-id"));
    cprops.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, JsonDeserializer.class);
    cprops.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");
    return cprops;
}

@Bean
public ConsumerFactory<String, BizAck> bizAckConsumerFactory() {
    return new DefaultKafkaConsumerFactory<>(
            consumerConfigs(), new StringDeserializer(), new JsonDeserializer<>(BizAck.class));
}

@Bean
public ConcurrentKafkaListenerContainerFactory<String, BizAck> bizAckKafkaListenerContainerFactory() {
    ConcurrentKafkaListenerContainerFactory<String, BizAck> factory
            = new ConcurrentKafkaListenerContainerFactory<>();
    factory.setConsumerFactory(bizAckConsumerFactory());
    return factory;
}

@Bean
public ConsumerFactory<String, DailyTransfer> consumerFactoryDailyTransfer() {
    Map<String, Object> config = new HashMap<>();
    config.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, env.getProperty("spring.kafka.bootstrap-servers"));
    config.put(ConsumerConfig.GROUP_ID_CONFIG, env.getProperty("daily.transfer.consumer.group-id"));
    config.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
    config.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, JsonDeserializer.class);
    config.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");
    return new DefaultKafkaConsumerFactory<>(config, new StringDeserializer(),
            new JsonDeserializer<>(DailyTransfer.class));
}

@Bean(name = "kafkaListenerContainerFactoryDailyTransfer")
public ConcurrentKafkaListenerContainerFactory<String, DailyTransfer> kafkaListenerContainerFactoryDailyTransfer() {
    ConcurrentKafkaListenerContainerFactory<String, DailyTransfer> factory = new ConcurrentKafkaListenerContainerFactory<>();
    factory.setConsumerFactory(consumerFactoryDailyTransfer());
    return factory;
}
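As an aside (not part of the original question): in spring-kafka 2.x the JsonDeserializer consults the type-info (`__TypeId__`) headers on a record by default, so a record stamped by a producer as `BizAck` can override the `DailyTransfer` target type configured above. A minimal sketch of the DailyTransfer factory with header-based typing disabled, assuming the two-argument `JsonDeserializer(Class, boolean)` constructor available since spring-kafka 2.2:

```java
// Hypothetical variant of consumerFactoryDailyTransfer(): the 'false' flag
// tells JsonDeserializer to ignore any __TypeId__ header on the record and
// always deserialize the JSON value into DailyTransfer.
@Bean
public ConsumerFactory<String, DailyTransfer> consumerFactoryDailyTransfer() {
    Map<String, Object> config = new HashMap<>();
    config.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, env.getProperty("spring.kafka.bootstrap-servers"));
    config.put(ConsumerConfig.GROUP_ID_CONFIG, env.getProperty("daily.transfer.consumer.group-id"));
    config.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");
    return new DefaultKafkaConsumerFactory<>(config, new StringDeserializer(),
            new JsonDeserializer<>(DailyTransfer.class, false));
}
```

The key/value deserializer entries in the config map are omitted here because deserializer instances are passed to the factory constructor directly, which takes precedence.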

Listeners
---------
// listener to consume BizAck messages
@KafkaListener(topics = "${spring.kafka.json.topic}", containerFactory = "bizAckKafkaListenerContainerFactory",
        groupId = "${spring.kafka.consumer.group-id}")
public void ssnitAckListener(BizAck bizAck) {
    logger.info("Received message='{}' from Kafka Topic", bizAck.toString());
    this.simpMessagingTemplate.convertAndSend("/bizTransfers/pushNotification", bizAck);
}

// listener to consume DailyTransfer messages
@KafkaListener(topics = "${spring.kafka.json.topic2}", containerFactory = "kafkaListenerContainerFactoryDailyTransfer",
        groupId = "${daily.transfer.consumer.group-id}")
public void dailyTransferListener(DailyTransfer dailyTransfer) {
    logger.info("Received message='{}' from transfer summary count Kafka Topic", dailyTransfer.toString());
    this.simpMessagingTemplate.convertAndSend("/summaryCounts/pushNotification", dailyTransfer);
}

The first listener, the one consuming BizAck messages, works fine. See the log below.

INFO 9708 --- [ntainer#1-0-C-1] g.g.g.s.n.kafka.BizAckTopicListener : Received message='BizAck{rgdno='CS006192018', ack_message='Received Business Registration: CS006192018', bizname='DASEL ENGINEERING COMPANY LIMITED', tin='C0010143181', incordate='09-JAN-2018', commencedate='09-JAN-2018', biz_pk=3667, ack_at='2019-04-23T08:51:02.684Z', ack_at_ms=1556009462684}' from Kafka Topic

But the second listener, the one consuming DailyTransfer messages, throws an error.

ERROR 9708 --- [ntainer#0-0-C-1] o.s.kafka.listener.LoggingErrorHandler : Error while processing: ConsumerRecord(topic = DAILY_TRANSFER_COUNTS, partition = 3, offset = 173, CreateTime = 1556009462652, serialized key size = 10, serialized value size = 51, headers = RecordHeaders(headers = [], isReadOnly = false), key = 2019-04-23, value = BizAck{rgdno='null', ack_message='null', bizname='null', tin='null', incordate='null', commencedate='null', biz_pk=null, ack_at='null', ack_at_ms=null})

org.springframework.kafka.listener.ListenerExecutionFailedException: Listener method could not be invoked with the incoming message
Endpoint handler details:
Method [public void gh.biztransfers.notification.kafka.DailyTransferTopicListener.dailyTransferListener(gh.biztransfers.notification.model.DailyTransfer)]
Bean [gh.biztransfers.notification.kafka.DailyTransferTopicListener@14bd03ea]; nested exception is org.springframework.messaging.converter.MessageConversionException: Cannot handle message; nested exception is org.springframework.messaging.converter.MessageConversionException: Cannot convert from [gh.biztransfers.notification.model.BizAck] to [gh.biztransfers.notification.model.DailyTransfer] for GenericMessage [payload=BizAck{rgdno='null', ack_message='null', bizname='null', tin='null', incordate='null', commencedate='null', biz_pk=null, ack_at='null', ack_at_ms=null}, headers={kafka_offset=173, kafka_consumer=org.apache.kafka.clients.consumer.KafkaConsumer@db7f48, kafka_timestampType=CREATE_TIME, kafka_receivedMessageKey=2019-04-23, kafka_receivedPartitionId=3, kafka_receivedTopic=DAILY_TRANSFER_COUNTS, kafka_receivedTimestamp=1556009462652}]
    at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.decorateException(KafkaMessageListenerContainer.java:1311) [spring-kafka-2.2.5.RELEASE.jar:2.2.5.RELEASE]
    at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.invokeErrorHandler(KafkaMessageListenerContainer.java:1300) [spring-kafka-2.2.5.RELEASE.jar:2.2.5.RELEASE]
    at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.doInvokeRecordListener(KafkaMessageListenerContainer.java:1227) [spring-kafka-2.2.5.RELEASE.jar:2.2.5.RELEASE]
    at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.doInvokeWithRecords(KafkaMessageListenerContainer.java:1198) [spring-kafka-2.2.5.RELEASE.jar:2.2.5.RELEASE]
    at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.invokeRecordListener(KafkaMessageListenerContainer.java:1118) [spring-kafka-2.2.5.RELEASE.jar:2.2.5.RELEASE]
    at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.invokeListener(KafkaMessageListenerContainer.java:933) [spring-kafka-2.2.5.RELEASE.jar:2.2.5.RELEASE]
    at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.pollAndInvoke(KafkaMessageListenerContainer.java:749) [spring-kafka-2.2.5.RELEASE.jar:2.2.5.RELEASE]
    at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.run(KafkaMessageListenerContainer.java:698) [spring-kafka-2.2.5.RELEASE.jar:2.2.5.RELEASE]
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) [na:1.8.0_192]
    at java.util.concurrent.FutureTask.run(FutureTask.java:266) [na:1.8.0_192]
    at java.lang.Thread.run(Thread.java:748) [na:1.8.0_192]
Caused by: org.springframework.messaging.converter.MessageConversionException: Cannot handle message; nested exception is org.springframework.messaging.converter.MessageConversionException: Cannot convert from [gh.biztransfers.notification.model.BizAck] to [gh.biztransfers.notification.model.DailyTransfer] for GenericMessage [payload=BizAck{...}, headers={...}]
    at org.springframework.kafka.listener.adapter.MessagingMessageListenerAdapter.invokeHandler(MessagingMessageListenerAdapter.java:292) ~[spring-kafka-2.2.5.RELEASE.jar:2.2.5.RELEASE]
    at org.springframework.kafka.listener.adapter.RecordMessagingMessageListenerAdapter.onMessage(RecordMessagingMessageListenerAdapter.java:79) ~[spring-kafka-2.2.5.RELEASE.jar:2.2.5.RELEASE]
    at org.springframework.kafka.listener.adapter.RecordMessagingMessageListenerAdapter.onMessage(RecordMessagingMessageListenerAdapter.java:50) ~[spring-kafka-2.2.5.RELEASE.jar:2.2.5.RELEASE]
    at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.doInvokeOnMessage(KafkaMessageListenerContainer.java:1263) [spring-kafka-2.2.5.RELEASE.jar:2.2.5.RELEASE]
    at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.invokeOnMessage(KafkaMessageListenerContainer.java:1256) [spring-kafka-2.2.5.RELEASE.jar:2.2.5.RELEASE]
    at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.doInvokeRecordListener(KafkaMessageListenerContainer.java:1217) [spring-kafka-2.2.5.RELEASE.jar:2.2.5.RELEASE]
    ... 8 common frames omitted
Caused by: org.springframework.messaging.converter.MessageConversionException: Cannot convert from [gh.biztransfers.notification.model.BizAck] to [gh.biztransfers.notification.model.DailyTransfer] for GenericMessage [payload=BizAck{...}, headers={...}]
    at org.springframework.messaging.handler.annotation.support.PayloadArgumentResolver.resolveArgument(PayloadArgumentResolver.java:144) ~[spring-messaging-5.1.6.RELEASE.jar:5.1.6.RELEASE]
    at org.springframework.kafka.annotation.KafkaListenerAnnotationBeanPostProcessor$KafkaHandlerMethodFactoryAdapter$1.resolveArgument(KafkaListenerAnnotationBeanPostProcessor.java:840) ~[spring-kafka-2.2.5.RELEASE.jar:2.2.5.RELEASE]
    at org.springframework.messaging.handler.invocation.HandlerMethodArgumentResolverComposite.resolveArgument(HandlerMethodArgumentResolverComposite.java:117) ~[spring-messaging-5.1.6.RELEASE.jar:5.1.6.RELEASE]
    at org.springframework.messaging.handler.invocation.InvocableHandlerMethod.getMethodArgumentValues(InvocableHandlerMethod.java:147) ~[spring-messaging-5.1.6.RELEASE.jar:5.1.6.RELEASE]
    at org.springframework.messaging.handler.invocation.InvocableHandlerMethod.invoke(InvocableHandlerMethod.java:116) ~[spring-messaging-5.1.6.RELEASE.jar:5.1.6.RELEASE]
    at org.springframework.kafka.listener.adapter.HandlerAdapter.invoke(HandlerAdapter.java:48) ~[spring-kafka-2.2.5.RELEASE.jar:2.2.5.RELEASE]
    at org.springframework.kafka.listener.adapter.MessagingMessageListenerAdapter.invokeHandler(MessagingMessageListenerAdapter.java:283) ~[spring-kafka-2.2.5.RELEASE.jar:2.2.5.RELEASE]
    ... 13 common frames omitted

Why is the second listener picking up BizAck messages and trying to convert/process them?

**Sender configuration excerpt**

@Autowired
private KafkaTemplate<String, BizAck> bizAckKafkaTemplate;

public void sendAcknowledgementMessage(String rcvdMessage) {
    BizAck bizAck = utils.JsonStr2BizAck(rcvdMessage);
    logger.info("Sending acknowledgement message to Kafka for : \n" + "Biz Regn: " + bizAck.getRgdno() + ", TIN : " + bizAck.getTin() + ", Name: " + bizAck.getBizname());
    // the KafkaTemplate provides asynchronous send methods returning a Future
    ListenableFuture<SendResult<String, BizAck>> future = bizAckKafkaTemplate.send(Objects.requireNonNull(env.getProperty("spring.kafka.json.topic")), bizAck);
    // register a callback with the listener to receive the result of the send asynchronously
    future.addCallback(new ListenableFutureCallback<SendResult<String, BizAck>>() {
        @Override
        public void onSuccess(SendResult<String, BizAck> result) {
            logger.info("Successfully sent message=[ " + rcvdMessage + " ] with offset=[" + result.getRecordMetadata().offset() + "]");
        }

        @Override
        public void onFailure(Throwable ex) {
            logger.info("Unable to send message=[ " + rcvdMessage + " ] due to : " + ex.getMessage());
        }
    });
}

**Sender log excerpt**

2019-04-23 11:06:02.999 INFO 9708 --- [enerContainer-1] g.g.g.s.n.kafka.AcknowledgementSender : Sending acknowledgement message to Kafka for : Biz Regn: CG094562018, TIN : C0910331870, Name: COMMUNITY DEVELOPMENT

2019-04-23 11:06:02.999 INFO 9708 --- [ad | producer-1] g.g.g.s.n.kafka.AcknowledgementSender : Successfully sent message=[ {"rgdno":"CG094562018","bizname":"COMMUNITY DEVELOPMENT","tin":"C0910331870","incordate":"16-JAN-2018","commencedate":"16-JAN-2018","biz_pk":3800,"ack_at":"2019-04-23T11:06:02.858Z","ack_at_ms":1556017562858,"ack_message":"Received Biz Regn: CG002642018"} ] with offset=[3556]

I am successfully sending these messages to a topic other than DAILY_TRANSFER_COUNTS. The DAILY_TRANSFER_COUNTS topic is derived from a query run against that topic in KSQL.
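For reference (not in the original question), a producer factory backing a `KafkaTemplate<String, BizAck>` like the one above might look like the following sketch. The bean names are hypothetical; the `ADD_TYPE_INFO_HEADERS = false` setting stops spring-kafka's JsonSerializer from stamping a `__TypeId__` header (naming BizAck) onto every record, which matters when other consumers deserialize the same bytes into a different target type:

```java
// Hypothetical producer configuration; property names come from
// org.springframework.kafka.support.serializer.JsonSerializer and
// org.apache.kafka.clients.producer.ProducerConfig.
@Bean
public ProducerFactory<String, BizAck> bizAckProducerFactory() {
    Map<String, Object> props = new HashMap<>();
    props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, env.getProperty("spring.kafka.bootstrap-servers"));
    props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
    props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, JsonSerializer.class);
    // do not write type-info headers; consumers choose their own target type
    props.put(JsonSerializer.ADD_TYPE_INFO_HEADERS, false);
    return new DefaultKafkaProducerFactory<>(props);
}

@Bean
public KafkaTemplate<String, BizAck> bizAckKafkaTemplate() {
    return new KafkaTemplate<>(bizAckProducerFactory());
}
```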

1 Answer

Answer 0 (score: 0)

Clearly, the second topic has a BizAck in it ...


ConsumerRecord(topic = DAILY_TRANSFER_COUNTS,... value = BizAck {...

So the problem would appear to be on the sending side.
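One way to verify what is actually landing on the topic is to consume it with plain String deserialization and log the raw value and headers. This is a hedged sketch: the `stringKafkaListenerContainerFactory` container factory is hypothetical (it would need to be defined with `StringDeserializer` for both key and value), and a `__TypeId__` header naming BizAck would confirm that the producer is writing BizAck records there:

```java
// Hypothetical debug listener: bypasses JSON conversion entirely so the
// record is seen exactly as stored on the topic.
@KafkaListener(topics = "DAILY_TRANSFER_COUNTS", groupId = "debug-inspector",
        containerFactory = "stringKafkaListenerContainerFactory")
public void inspect(ConsumerRecord<String, String> record) {
    logger.info("raw value = {}", record.value());
    record.headers().forEach(h ->
            // a __TypeId__ header of gh.biztransfers.notification.model.BizAck
            // would explain why the JsonDeserializer produced a BizAck
            logger.info("header {} = {}", h.key(), new String(h.value())));
}
```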
