How do I read files from two different directories with @InboundChannelAdapter (Spring Integration with Java configuration)?

Time: 2018-09-08 02:15:27

Tags: spring-integration spring-integration-sftp

I am trying to implement the following scenario:

I have a trigger file and data files, which are kept in different directories. I should access the data files only after I receive the trigger file, and then perform the splitting and further processing logic. Also, there is only one trigger file but multiple data files, so after picking up the trigger file I should be able to process all of the data files.

Below is the code I am using, but it only picks up files from one directory:

private static final Logger LOGGER = LoggerFactory.getLogger(DatastreamApplication.class);
private static final String DATA_DIRECTORY_PATH = "dataDirectoryLocation";

@SuppressWarnings("deprecation")
public static void main(String[] args) {
    new SpringApplicationBuilder(DatastreamApplication.class).web(false).run(args);
}

@Bean
@InboundChannelAdapter(channel = "fileInputChannel", poller = @Poller(fixedDelay = "5000"))
public MessageSource<File> sftpMessageSource() {
    FileReadingMessageSource source = new FileReadingMessageSource();
    source.setDirectory(new File(DATA_DIRECTORY_PATH));
    source.setFilter(new AcceptOnceFileListFilter<>());
    return source;
}

@Splitter(inputChannel = "fileInputChannel")
@Bean
public FileSplitter fileSplitter() {
   FileSplitter fileSplitter = new FileSplitter();
   fileSplitter.setOutputChannelName("chunkingChannel");
   return fileSplitter;
}

@ServiceActivator(inputChannel = "chunkingChannel")
@Bean
public AggregatingMessageHandler  chunker() {
    AggregatingMessageHandler aggregator = new AggregatingMessageHandler(new DefaultAggregatingMessageGroupProcessor());
    aggregator.setReleaseStrategy(new MessageCountReleaseStrategy(1000));
    aggregator.setExpireGroupsUponCompletion(true);
    aggregator.setGroupTimeoutExpression(new ValueExpression<>(100L));
    aggregator.setSendPartialResultOnExpiry(true);
    aggregator.setOutputChannelName("processFileChannel");
    return aggregator;
}

@Bean
@ServiceActivator(inputChannel = "processFileChannel")
public MessageHandler handler() {
    return new MessageHandler() {

        @Override
        public void handleMessage(Message<?> message) throws MessagingException {
            List<String> strings = (List<String>) message.getPayload();
            System.out.println( "List Size :  "+ strings.size() + " for List " + strings.toString());
        }

    };
}

1 Answer:

Answer 0 (score: 0)

In this case, it sounds like a better solution might be to use the inbound adapter to look for the trigger file, and then a custom service (invoked by a service activator) to return the list of files in the data directory.
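
For illustration only, here is a minimal annotation-style sketch of that idea, kept close to the question's configuration and assuming Spring Boot's integration auto-configuration (or @EnableIntegration) is in place. The trigger directory path, the triggerChannel name, the *.trigger pattern, and the class itself are assumptions, and a @Splitter method is used rather than a plain service activator so that each data file is sent individually to the existing fileInputChannel:

import java.io.File;
import java.util.Arrays;
import java.util.Collections;
import java.util.List;

import org.springframework.context.annotation.Bean;
import org.springframework.integration.annotation.InboundChannelAdapter;
import org.springframework.integration.annotation.Poller;
import org.springframework.integration.annotation.Splitter;
import org.springframework.integration.core.MessageSource;
import org.springframework.integration.file.FileReadingMessageSource;
import org.springframework.integration.file.filters.SimplePatternFileListFilter;
import org.springframework.stereotype.Component;

// Sketch only: a second inbound adapter watches the trigger directory, and a
// custom splitter method returns the data files so that each one flows into the
// question's existing fileInputChannel. Paths and channel names are assumed.
@Component
public class TriggerFileFlow {

    private static final String TRIGGER_DIRECTORY_PATH = "triggerDirectoryLocation"; // assumed
    private static final String DATA_DIRECTORY_PATH = "dataDirectoryLocation";       // assumed

    @Bean
    @InboundChannelAdapter(channel = "triggerChannel", poller = @Poller(fixedDelay = "5000"))
    public MessageSource<File> triggerFileSource() {
        FileReadingMessageSource source = new FileReadingMessageSource();
        source.setDirectory(new File(TRIGGER_DIRECTORY_PATH));
        source.setFilter(new SimplePatternFileListFilter("*.trigger"));
        return source;
    }

    // When a trigger file arrives, list the data files; the splitter emits one
    // message per File onto fileInputChannel, where the FileSplitter from the
    // question splits each file into lines.
    @Splitter(inputChannel = "triggerChannel", outputChannel = "fileInputChannel")
    public List<File> dataFilesForTrigger(File triggerFile) {
        File[] dataFiles = new File(DATA_DIRECTORY_PATH).listFiles();
        return dataFiles == null ? Collections.emptyList() : Arrays.asList(dataFiles);
    }
}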

EDIT

@SpringBootApplication
public class So52231415Application {

    public static void main(String[] args) {
        // delete any files left over from a previous run
        new File("/tmp/bar/baz1.txt").delete();
        new File("/tmp/bar/baz2.txt").delete();
        new File("/tmp/foo/baz.trigger").delete();
        SpringApplication.run(So52231415Application.class, args);
    }

    @Bean
    public IntegrationFlow flow() {
        // Poll /tmp/foo every 5 seconds, picking up only *.trigger files.
        return IntegrationFlows.from(Files.inboundAdapter(new File("/tmp/foo"))
                    .filterExpression("name.endsWith('.trigger')"), e -> e.poller(Pollers.fixedDelay(5_000)))
                // For each trigger file, collect the files in /tmp/bar that share
                // its base name (baz.trigger -> baz1.txt, baz2.txt).
                .<File, List<File>>transform(f -> {
                    File[] files = new File("/tmp/bar").listFiles();
                    String prefix = f.getName().substring(0, f.getName().lastIndexOf('.'));
                    return Arrays.stream(files)
                        .filter(ff -> ff.getName().startsWith(prefix))
                        .collect(Collectors.toList());
                })
                // One message per data file.
                .split()
                // Remove the trigger file's file_* headers; the FileSplitter below
                // adds its own headers for each data file.
                .headerFilter(FileHeaders.ORIGINAL_FILE, FileHeaders.FILENAME, FileHeaders.RELATIVE_PATH)
                // Split each data file into lines.
                .split(new FileSplitter())
                .handle(System.out::println)
                .get();
    }

    @Bean
    public ApplicationRunner runner() {
        return args -> {
            // write the two data files first, then the trigger file last
            FileOutputStream fos = new FileOutputStream(new File("/tmp/bar/baz1.txt"));
            fos.write("one\ntwo\nthree\n".getBytes());
            fos.close();
            fos = new FileOutputStream(new File("/tmp/bar/baz2.txt"));
            fos.write("four\nfive\nsix\n".getBytes());
            fos.close();
            fos = new FileOutputStream(new File("/tmp/foo/baz.trigger"));
            fos.write("\n".getBytes());
            fos.close();
        };
    }

}
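
Running the application produces output like the following (each line is printed by the final handle step); note that the file_name and file_originalFile headers refer to the data files rather than the trigger file: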

GenericMessage [payload=one, headers={sequenceNumber=1, sequenceDetails=[[a06d78dc-8a29-34aa-d8c1-64468edded5b, 1, 2]], sequenceSize=0, file_name=baz1.txt, correlationId=6581c159-352a-b943-d325-db58761b573d, file_originalFile=/tmp/bar/baz1.txt, id=6663d227-ee87-c997-6c67-01b4e50f7b6a, timestamp=1538584697084}]
GenericMessage [payload=two, headers={sequenceNumber=2, sequenceDetails=[[a06d78dc-8a29-34aa-d8c1-64468edded5b, 1, 2]], sequenceSize=0, file_name=baz1.txt, correlationId=6581c159-352a-b943-d325-db58761b573d, file_originalFile=/tmp/bar/baz1.txt, id=d48caf6e-5f3c-76cc-98a8-63800ee1acb8, timestamp=1538584697084}]
GenericMessage [payload=three, headers={sequenceNumber=3, sequenceDetails=[[a06d78dc-8a29-34aa-d8c1-64468edded5b, 1, 2]], sequenceSize=0, file_name=baz1.txt, correlationId=6581c159-352a-b943-d325-db58761b573d, file_originalFile=/tmp/bar/baz1.txt, id=8486ed52-d6a3-6eaf-20de-555b18ea0d75, timestamp=1538584697084}]
GenericMessage [payload=four, headers={sequenceNumber=1, sequenceDetails=[[a06d78dc-8a29-34aa-d8c1-64468edded5b, 2, 2]], sequenceSize=0, file_name=baz2.txt, correlationId=5842bfd7-f25c-9faf-e2f8-7b2a2badf869, file_originalFile=/tmp/bar/baz2.txt, id=3d301ccf-2b80-d4a4-4c59-5f9e38180fe5, timestamp=1538584697087}]
GenericMessage [payload=five, headers={sequenceNumber=2, sequenceDetails=[[a06d78dc-8a29-34aa-d8c1-64468edded5b, 2, 2]], sequenceSize=0, file_name=baz2.txt, correlationId=5842bfd7-f25c-9faf-e2f8-7b2a2badf869, file_originalFile=/tmp/bar/baz2.txt, id=4e43f02c-d64e-0d11-18a4-3ab8b6e7d383, timestamp=1538584697087}]