How to prevent ListenerExecutionFailedException: Listener threw exception

Date: 2017-10-06 16:39:30

Tags: spring-integration spring-batch spring-amqp

What do I need to do to prevent the following exception, which is thrown by RabbitMQ,

org.springframework.amqp.rabbit.listener.exception.ListenerExecutionFailedException: Listener threw exception
    at org.springframework.amqp.rabbit.listener.AbstractMessageListenerContainer.wrapToListenerExecutionFailedExceptionIfNeeded(AbstractMessageListenerContainer.java:877)
    at org.springframework.amqp.rabbit.listener.AbstractMessageListenerContainer.doInvokeListener(AbstractMessageListenerContainer.java:787)
    at org.springframework.amqp.rabbit.listener.AbstractMessageListenerContainer.invokeListener(AbstractMessageListenerContainer.java:707)
    at org.springframework.amqp.rabbit.listener.SimpleMessageListenerContainer.access$001(SimpleMessageListenerContainer.java:98)
    at org.springframework.amqp.rabbit.listener.SimpleMessageListenerContainer$1.invokeListener(SimpleMessageListenerContainer.java:189)
    at org.springframework.amqp.rabbit.listener.SimpleMessageListenerContainer.invokeListener(SimpleMessageListenerContainer.java:1236)
    at org.springframework.amqp.rabbit.listener.AbstractMessageListenerContainer.executeListener(AbstractMessageListenerContainer.java:688)
    at org.springframework.amqp.rabbit.listener.SimpleMessageListenerContainer.doReceiveAndExecute(SimpleMessageListenerContainer.java:1190)
    at org.springframework.amqp.rabbit.listener.SimpleMessageListenerContainer.receiveAndExecute(SimpleMessageListenerContainer.java:1174)
    at org.springframework.amqp.rabbit.listener.SimpleMessageListenerContainer.access$1200(SimpleMessageListenerContainer.java:98)
    at org.springframework.amqp.rabbit.listener.SimpleMessageListenerContainer$AsyncMessageProcessingConsumer.run(SimpleMessageListenerContainer.java:1363)
    at java.lang.Thread.run(Thread.java:748)
    Caused by: org.springframework.messaging.MessageDeliveryException: failed to send Message to channel 'amqpLaunchSpringBatchJobFlow.channel#0'; nested exception is jp.ixam_drive.batch.service.JobExecutionRuntimeException: Failed to start job with name ads-insights-import and parameters {accessToken=<ACCESS_TOKEN>, id=act_1234567890, classifier=stats, report_run_id=1482330625184792, job_request_id=32}
    at org.springframework.integration.channel.AbstractMessageChannel.send(AbstractMessageChannel.java:449)
    at org.springframework.integration.channel.AbstractMessageChannel.send(AbstractMessageChannel.java:373)
    at org.springframework.messaging.core.GenericMessagingTemplate.doSend(GenericMessagingTemplate.java:115)
    at org.springframework.messaging.core.GenericMessagingTemplate.doSend(GenericMessagingTemplate.java:45)
    at org.springframework.messaging.core.AbstractMessageSendingTemplate.send(AbstractMessageSendingTemplate.java:105)
    at org.springframework.integration.endpoint.MessageProducerSupport.sendMessage(MessageProducerSupport.java:171)
    at org.springframework.integration.amqp.inbound.AmqpInboundChannelAdapter.access$400(AmqpInboundChannelAdapter.java:45)
    at org.springframework.integration.amqp.inbound.AmqpInboundChannelAdapter$1.onMessage(AmqpInboundChannelAdapter.java:95)
    at org.springframework.amqp.rabbit.listener.AbstractMessageListenerContainer.doInvokeListener(AbstractMessageListenerContainer.java:784)
    ... 10 common frames omitted
    Caused by: jp.ixam_drive.batch.service.JobExecutionRuntimeException: Failed to start job with name ads-insights-import and parameters {accessToken=<ACCESS_TOKEN>, id=act_1234567890, classifier=stats, report_run_id=1482330625184792, job_request_id=32}
    at jp.ixam_drive.facebook.SpringBatchLauncher.launchJob(SpringBatchLauncher.java:42)
    at jp.ixam_drive.facebook.AmqpBatchLaunchIntegrationFlows.lambda$amqpLaunchSpringBatchJobFlow$1(AmqpBatchLaunchIntegrationFlows.java:71)
    at org.springframework.integration.dispatcher.AbstractDispatcher.tryOptimizedDispatch(AbstractDispatcher.java:116)
    at org.springframework.integration.dispatcher.UnicastingDispatcher.doDispatch(UnicastingDispatcher.java:148)
    at org.springframework.integration.dispatcher.UnicastingDispatcher.dispatch(UnicastingDispatcher.java:121)
    at org.springframework.integration.channel.AbstractSubscribableChannel.doSend(AbstractSubscribableChannel.java:89)
    at org.springframework.integration.channel.AbstractMessageChannel.send(AbstractMessageChannel.java:423)
    ... 18 common frames omitted
    Caused by: org.springframework.batch.core.repository.JobInstanceAlreadyCompleteException: A job instance already exists and is complete for parameters={accessToken=<ACCESS_TOKEN>, id=act_1234567890, classifier=stats, report_run_id=1482330625184792, job_request_id=32}.  If you want to run this job again, change the parameters.
    at org.springframework.batch.core.repository.support.SimpleJobRepository.createJobExecution(SimpleJobRepository.java:126)
    at sun.reflect.GeneratedMethodAccessor193.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:333)
    at org.springframework.aop.framework.ReflectiveMethodInvocation.invokeJoinpoint(ReflectiveMethodInvocation.java:190)
    at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:157)
    at org.springframework.transaction.interceptor.TransactionInterceptor$1.proceedWithInvocation(TransactionInterceptor.java:99)
    at org.springframework.transaction.interceptor.TransactionAspectSupport.invokeWithinTransaction(TransactionAspectSupport.java:282)
    at org.springframework.transaction.interceptor.TransactionInterceptor.invoke(TransactionInterceptor.java:96)
    at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179)
    at org.springframework.batch.core.repository.support.AbstractJobRepositoryFactoryBean$1.invoke(AbstractJobRepositoryFactoryBean.java:172)
    at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179)
    at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:213)
    at com.sun.proxy.$Proxy125.createJobExecution(Unknown Source)
    at org.springframework.batch.core.launch.support.SimpleJobLauncher.run(SimpleJobLauncher.java:125)
    at jp.ixam_drive.batch.service.JobOperationsService.launch(JobOperationsService.java:64)
    at jp.ixam_drive.facebook.SpringBatchLauncher.launchJob(SpringBatchLauncher.java:37)
    ... 24 common frames omitted

when I have two instances of the Spring Boot application, both running the following code in parallel to execute Spring Batch jobs?

@Configuration
@Conditional(AmqpBatchLaunchCondition.class)
@Slf4j
public class AmqpAsyncAdsInsightsConfiguration {

    @Autowired
    ObjectMapper objectMapper;

    @Value("${batch.launch.amqp.routing-keys.async-insights}")
    String routingKey;

    @Bean
    public IntegrationFlow amqpOutboundAsyncAdsInsights(AmqpTemplate amqpTemplate) {
        return IntegrationFlows.from("async_ads_insights")
                .<JobParameters, byte[]>transform(SerializationUtils::serialize)
                .handle(Amqp.outboundAdapter(amqpTemplate).routingKey(routingKey)).get();
    }

    @Bean
    public IntegrationFlow amqpAdsInsightsAsyncJobRequestFlow(FacebookMarketingServiceProvider serviceProvider,
            JobParametersToApiParametersTransformer transformer, ConnectionFactory connectionFactory) {
        return IntegrationFlows.from(Amqp.inboundAdapter(connectionFactory, routingKey))
                .<byte[], JobParameters>transform(SerializationUtils::deserialize)
                .<JobParameters, ApiParameters>transform(transformer)
                .<ApiParameters>handle((payload, header) -> {
                    String accessToken = (String) header.get("accessToken");
                    String id = (String) header.get("object_id");
                    FacebookMarketingApi api = serviceProvider.getApi(accessToken);
                    String reportRunId = api.asyncRequestOperations().getReportRunId(id, payload.toMap());
                    ObjectNode objectNode = objectMapper.createObjectNode();
                    objectNode.put("accessToken", accessToken);
                    objectNode.put("id", id);
                    objectNode.put("report_run_id", reportRunId);
                    objectNode.put("classifier", (String) header.get("classifier"));
                    objectNode.put("job_request_id", (Long) header.get("job_request_id"));
                    return serialize(objectNode);
                }).channel("ad_report_run_polling_channel").get();
    }

    @SneakyThrows
    private String serialize(JsonNode jsonNode) {
        return objectMapper.writeValueAsString(jsonNode);
    }
}

@Configuration
@Conditional(AmqpBatchLaunchCondition.class)
@Slf4j
public class AmqpBatchLaunchIntegrationFlows {

    @Autowired
    SpringBatchLauncher batchLauncher;

    @Value("${batch.launch.amqp.routing-keys.job-launch}")
    String routingKey;

    @Bean(name = "batch_launch_channel")
    public MessageChannel batchLaunchChannel() {
        return MessageChannels.executor(Executors.newSingleThreadExecutor()).get();
    }

    @Bean
    public IntegrationFlow amqpOutbound(AmqpTemplate amqpTemplate,
            @Qualifier("batch_launch_channel") MessageChannel batchLaunchChannel) {
        return IntegrationFlows.from(batchLaunchChannel)
                .<JobParameters, byte[]>transform(SerializationUtils::serialize)
                .handle(Amqp.outboundAdapter(amqpTemplate).routingKey(routingKey)).get();
    }

    @Bean
    public IntegrationFlow amqpLaunchSpringBatchJobFlow(ConnectionFactory connectionFactory) {
        return IntegrationFlows.from(Amqp.inboundAdapter(connectionFactory, routingKey))
                .handle(message -> {
                    String jobName = (String) message.getHeaders().get("job_name");
                    byte[] bytes = (byte[]) message.getPayload();
                    JobParameters jobParameters = SerializationUtils.deserialize(bytes);
                    batchLauncher.launchJob(jobName, jobParameters);
                }).get();
    }
}

@Configuration
@Slf4j
public class AsyncAdsInsightsConfiguration {

    @Value("${batch.core.pool.size}")
    public Integer batchCorePoolSize;

    @Value("${ixam_drive.facebook.api.ads-insights.async-poll-interval}")
    public String asyncPollInterval;

    @Autowired
    ObjectMapper objectMapper;

    @Autowired
    private DataSource dataSource;

    @Bean(name = "async_ads_insights")
    public MessageChannel adsInsightsAsyncJobRequestChannel() {
        return MessageChannels.direct().get();
    }

    @Bean(name = "ad_report_run_polling_channel")
    public MessageChannel adReportRunPollingChannel() {
        return MessageChannels.executor(Executors.newFixedThreadPool(batchCorePoolSize)).get();
    }

    @Bean
    public IntegrationFlow adReportRunPollingLoopFlow(FacebookMarketingServiceProvider serviceProvider) {
        return IntegrationFlows.from(adReportRunPollingChannel())
                .<String>handle((payload, header) -> {
                    ObjectNode jsonNode = deserialize(payload);
                    String accessToken = jsonNode.get("accessToken").asText();
                    String reportRunId = jsonNode.get("report_run_id").asText();
                    try {
                        AdReportRun adReportRun = serviceProvider.getApi(accessToken)
                                .fetchObject(reportRunId, AdReportRun.class);
                        log.debug("ad_report_run: {}", adReportRun);
                        return jsonNode.set("ad_report_run", objectMapper.valueToTree(adReportRun));
                    } catch (Exception e) {
                        log.error("failed while polling for ad_report_run.id: {}", reportRunId);
                        throw new RuntimeException(e);
                    }
                }).<JsonNode, Boolean>route(payload -> {
                    JsonNode adReportRun = payload.get("ad_report_run");
                    return adReportRun.get("async_percent_completion").asInt() == 100 &&
                            "Job Completed".equals(adReportRun.get("async_status").asText());
                }, rs -> rs.subFlowMapping(true,
                        f -> f.transform(JsonNode.class,
                                source -> {
                                    JobParametersBuilder jobParametersBuilder = new JobParametersBuilder();
                                    jobParametersBuilder
                                            .addString("accessToken", source.get("accessToken").asText());
                                    jobParametersBuilder.addString("id", source.get("id").asText());
                                    jobParametersBuilder
                                            .addString("classifier", source.get("classifier").asText());
                                    jobParametersBuilder
                                            .addLong("report_run_id", source.get("report_run_id").asLong());
                                    jobParametersBuilder
                                            .addLong("job_request_id", source.get("job_request_id").asLong());
                                    return jobParametersBuilder.toJobParameters();
                                }).channel("batch_launch_channel"))
                        .subFlowMapping(false,
                                f -> f.transform(JsonNode.class, this::serialize)
                                        .<String>delay("delay", asyncPollInterval, c -> c.transactional()
                                                .messageStore(jdbcMessageStore()))
                                        .channel(adReportRunPollingChannel()))).get();
    }

    @SneakyThrows
    private String serialize(JsonNode jsonNode) {
        return objectMapper.writeValueAsString(jsonNode);
    }

    @SneakyThrows
    private ObjectNode deserialize(String payload) {
        return objectMapper.readerFor(ObjectNode.class).readValue(payload);
    }

    @Bean
    public JdbcMessageStore jdbcMessageStore() {
        JdbcMessageStore jdbcMessageStore = new JdbcMessageStore(dataSource);
        return jdbcMessageStore;
    }

    @Bean
    public JobParametersToApiParametersTransformer jobParametersToApiParametersTransformer() {
        return new JobParametersToApiParametersTransformer() {
            @Override
            protected ApiParameters transform(JobParameters jobParameters) {
                ApiParameters.ApiParametersBuilder builder = ApiParameters.builder();
                MultiValueMap<String, String> multiValueMap = new LinkedMultiValueMap<>();
                String level = jobParameters.getString("level");
                if (!StringUtils.isEmpty(level)) {
                    multiValueMap.set("level", level);
                }
                String fields = jobParameters.getString("fields");
                if (!StringUtils.isEmpty(fields)) {
                    multiValueMap.set("fields", fields);
                }
                String filter = jobParameters.getString("filter");
                if (filter != null) {
                    try {
                        JsonNode jsonNode = objectMapper.readTree(filter);
                        if (jsonNode != null && jsonNode.isArray()) {
                            List<ApiFilteringParameters> filteringParametersList = new ArrayList<>();
                            List<ApiSingleValueFilteringParameters> singleValueFilteringParameters = new ArrayList<>();
                            ArrayNode arrayNode = (ArrayNode) jsonNode;
                            arrayNode.forEach(node -> {
                                String field = node.get("field").asText();
                                String operator = node.get("operator").asText();
                                if (!StringUtils.isEmpty(field) && !StringUtils.isEmpty(operator)) {
                                    String values = node.get("values").asText();
                                    String[] valuesArray = !StringUtils.isEmpty(values) ? values.split(",") : null;
                                    if (valuesArray != null) {
                                        if (valuesArray.length > 1) {
                                            filteringParametersList.add(ApiFilteringParameters
                                                    .of(field, Operator.valueOf(operator), valuesArray));
                                        } else {
                                            singleValueFilteringParameters.add(ApiSingleValueFilteringParameters
                                                    .of(field, Operator.valueOf(operator), valuesArray[0]));
                                        }
                                    }
                                }
                            });
                            if (!filteringParametersList.isEmpty()) {
                                builder.filterings(filteringParametersList);
                            }
                            if (!singleValueFilteringParameters.isEmpty()) {
                                builder.filterings(singleValueFilteringParameters);
                            }
                        }

                    } catch (IOException e) {
                        throw new UncheckedIOException(e);
                    }
                }
                String start = jobParameters.getString("time_ranges.start");
                String end = jobParameters.getString("time_ranges.end");
                String since = jobParameters.getString("time_range.since");
                String until = jobParameters.getString("time_range.until");

                if (!StringUtils.isEmpty(start) && !StringUtils.isEmpty(end)) {
                    builder.timeRanges(ApiParameters.timeRanges(start, end));
                } else if (!StringUtils.isEmpty(since) && !StringUtils.isEmpty(until)) {
                    builder.timeRange(new TimeRange(since, until));
                }
                String actionBreakdowns = jobParameters.getString("action_breakdowns");
                if (!StringUtils.isEmpty(actionBreakdowns)) {
                    multiValueMap.set("action_breakdowns", actionBreakdowns);
                }
                String attributionWindows = jobParameters.getString("action_attribution_windows");
                if (attributionWindows != null) {
                    try {
                        multiValueMap
                                .set("action_attribution_windows",
                                        objectMapper.writeValueAsString(attributionWindows.split(",")));
                    } catch (JsonProcessingException e) {
                        e.printStackTrace();
                    }
                }
                builder.multiValueMap(multiValueMap);
                String pageSize = jobParameters.getString("pageSize");
                if (!StringUtils.isEmpty(pageSize)) {
                    builder.limit(pageSize);
                }
                return builder.build();
            }
        };
    }
}

Here is how the messages flow:

   1. channel[async_ads_insights] -> IntegrationFlow[amqpOutboundAsyncAdsInsights] -> [AMQP] -> IntegrationFlow[amqpAdsInsightsAsyncJobRequestFlow] -> channel[ad_report_run_polling_channel] -> IntegrationFlow[adReportRunPollingLoopFlow] - IF END LOOP -> channel[batch_launch_channel], ELSE -> channel[ad_report_run_polling_channel]

   2. channel[batch_launch_channel] -> IntegrationFlow[amqpOutbound] -> IntegrationFlow[amqpLaunchSpringBatchJobFlow]

   3. Spring Batch Job is launched.

The exception is not thrown right after both instances start, but only after a while. Launching the Spring Batch job does succeed at first, but then it starts failing with "A job instance already exists and is complete...".

The job retrieves Facebook ads results.

Any insight into what causes the error above would be appreciated.

I also have the following configuration, which does not use AMQP and runs without any problems, but it only works with a single instance:

@Configuration
@Conditional(SimpleBatchLaunchCondition.class)
@Slf4j
public class SimpleBatchLaunchIntegrationFlows {

    @Autowired
    SpringBatchLauncher batchLauncher;

    @Autowired
    DataSource dataSource;

    @Bean(name = "batch_launch_channel")
    public MessageChannel batchLaunchChannel() {
        return MessageChannels.queue(jdbcChannelMessageStore(), "batch_launch_channel").get();
    }

    @Bean
    public ChannelMessageStoreQueryProvider channelMessageStoreQueryProvider() {
        return new MySqlChannelMessageStoreQueryProvider();
    }

    @Bean
    public JdbcChannelMessageStore jdbcChannelMessageStore() {
        JdbcChannelMessageStore channelMessageStore = new JdbcChannelMessageStore(dataSource);
        channelMessageStore.setChannelMessageStoreQueryProvider(channelMessageStoreQueryProvider());
        channelMessageStore.setUsingIdCache(true);
        channelMessageStore.setPriorityEnabled(true);
        return channelMessageStore;
    }

    @Bean
    public IntegrationFlow launchSpringBatchJobFlow(@Qualifier("batch_launch_channel")
            MessageChannel batchLaunchChannel) {
        return IntegrationFlows.from(batchLaunchChannel)
                .handle(message -> {
                    String jobName = (String) message.getHeaders().get("job_name");
                    JobParameters jobParameters = (JobParameters) message.getPayload();
                    batchLauncher.launchJob(jobName, jobParameters);
                }, e->e.poller(Pollers.fixedRate(500).receiveTimeout(500))).get();
    }
}

1 Answer:

Answer 0 (score: 1)

See the Spring Batch documentation: when launching a new instance of a job, the job parameters must be unique.

A common technique is to add a dummy parameter with a UUID or similar, but Spring Batch also provides a JobParametersIncrementer strategy, e.g. to increment a numeric parameter on each launch.
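
For example, a minimal sketch of the dummy-parameter approach (the helper class and the run.uuid key are illustrative, not part of the original code; RunIdIncrementer is the built-in incrementer variant of the same idea):

import java.util.UUID;

import org.springframework.batch.core.JobParameters;
import org.springframework.batch.core.JobParametersBuilder;

// Makes otherwise-identical parameters unique so every launch creates a new JobInstance.
public final class UniqueJobParameters {

    private UniqueJobParameters() {
    }

    public static JobParameters withUniqueRunId(JobParameters original) {
        // copy the business parameters and append a unique dummy parameter
        return new JobParametersBuilder(original)
                .addString("run.uuid", UUID.randomUUID().toString())
                .toJobParameters();
    }
}

The caller, e.g. SpringBatchLauncher.launchJob, would then pass UniqueJobParameters.withUniqueRunId(jobParameters) to the JobLauncher instead of the raw parameters.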

EDIT

There is a special class of exceptions whose members are considered irrecoverable (fatal); there is no point in attempting to redeliver such messages.

Examples include MessageConversionException: if we cannot convert the message the first time, we are unlikely to be able to convert it on a redelivery either. The ConditionalRejectingErrorHandler is the mechanism by which such exceptions are detected and cause the message to be permanently rejected (not redelivered).

Other exceptions cause the message to be redelivered by default. There is also another property, defaultRequeueRejected, which can be set to false to permanently reject all failures (not recommended).

You can customize the error handler by subclassing its DefaultExceptionStrategy: override isUserCauseFatal(Throwable cause) to scan the cause tree for a JobInstanceAlreadyCompleteException and return true.
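
A sketch of that customization (the class name is illustrative):

import org.springframework.amqp.rabbit.listener.ConditionalRejectingErrorHandler;
import org.springframework.batch.core.repository.JobInstanceAlreadyCompleteException;

// Treats JobInstanceAlreadyCompleteException anywhere in the cause tree as fatal,
// so the ConditionalRejectingErrorHandler rejects the message instead of requeuing it.
public class JobInstanceAlreadyCompleteExceptionStrategy
        extends ConditionalRejectingErrorHandler.DefaultExceptionStrategy {

    @Override
    protected boolean isUserCauseFatal(Throwable cause) {
        // walk the cause chain looking for the Spring Batch exception
        for (Throwable t = cause; t != null; t = t.getCause()) {
            if (t instanceof JobInstanceAlreadyCompleteException) {
                return true;
            }
        }
        return false;
    }
}

The strategy is then wrapped in a ConditionalRejectingErrorHandler and set on the listener container, as sketched further below.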

  

"I think it is triggered by the error raised because the SpringBatch job is already running."

That still indicates you somehow received a second message with the same parameters; it is a different error because the original job was still running at the time. That message was rejected (and requeued), but on subsequent deliveries you get the already-complete exception.

So I would still say the root cause of your problem is the duplicate request, but you can avoid this behavior by configuring a custom error handler on the channel adapter's listener container, as sketched below.
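
For example, here is a hedged sketch of how the existing amqpLaunchSpringBatchJobFlow bean could be adapted to attach such a handler. It assumes the JobInstanceAlreadyCompleteExceptionStrategy class sketched above and reuses the routingKey, batchLauncher and @Slf4j log fields of AmqpBatchLaunchIntegrationFlows; the log statement is added to help spot where the duplicate deliveries come from.

    @Bean
    public IntegrationFlow amqpLaunchSpringBatchJobFlow(ConnectionFactory connectionFactory) {
        // build the listener container explicitly so a custom error handler can be attached;
        // the queue name and handler body mirror the original flow
        SimpleMessageListenerContainer container = new SimpleMessageListenerContainer(connectionFactory);
        container.setQueueNames(routingKey);
        container.setErrorHandler(new ConditionalRejectingErrorHandler(
                new JobInstanceAlreadyCompleteExceptionStrategy()));
        return IntegrationFlows.from(Amqp.inboundAdapter(container))
                .handle(message -> {
                    String jobName = (String) message.getHeaders().get("job_name");
                    JobParameters jobParameters =
                            (JobParameters) SerializationUtils.deserialize((byte[]) message.getPayload());
                    // log every launch request to help track down duplicate messages
                    log.info("launch request: job_name={}, parameters={}", jobName, jobParameters);
                    batchLauncher.launchJob(jobName, jobParameters);
                }).get();
    }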

I would suggest you log the duplicate message so you can figure out why you are receiving it.