I'm trying to implement a chunk-oriented step in Spring Batch which retrieves records from the database and writes each chunk to a separate file. For instance, assume there are 500 records in the DB. I need my job to create 10 files with 50 records each.
PS: The main purpose is to create the output files concurrently. Since ItemWriter implementations are not thread-safe, I decided to create separate files as the output so that I can reduce the total time spent completing the step.
Does anyone know how to implement this with Spring Batch? I found a sample project which processes multiple files in parallel using partitioning, but that's not exactly what I want to do. In my case, the input is a single table whereas the output is multiple files.
Here is the link: FileParallelProcessing
I found the answer: partitionJdbcJob from the Spring Batch samples does exactly what I want.
The Spring Batch sample job source can be found here.
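For anyone landing here later, the core of that sample is a partitioner that splits the table's key space into ranges, so each partition step reads only its own slice and writes it to its own file. A minimal sketch of that range-splitting logic (mirroring what the sample's ColumnRangePartitioner does; the class and method names here are my own, not the sample's):

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Sketch of the id-range logic behind partitionJdbcJob: split the key
// space [minId, maxId] into gridSize contiguous ranges. Each range becomes
// one partition whose step reads only those rows and writes its own file.
public class RangePartitioner {

    // Returns partition name -> {minValue, maxValue} (inclusive bounds).
    public static Map<String, long[]> partition(long minId, long maxId, int gridSize) {
        Map<String, long[]> partitions = new LinkedHashMap<>();
        long targetSize = (maxId - minId) / gridSize + 1;
        long start = minId;
        long end = start + targetSize - 1;
        for (int i = 0; i < gridSize && start <= maxId; i++) {
            partitions.put("partition" + i, new long[] { start, Math.min(end, maxId) });
            start += targetSize;
            end += targetSize;
        }
        return partitions;
    }

    public static void main(String[] args) {
        // 500 rows with ids 1..500 and gridSize 10 -> 10 files of 50 rows each
        for (Map.Entry<String, long[]> e : partition(1, 500, 10).entrySet()) {
            System.out.println(e.getKey() + ": " + e.getValue()[0] + ".." + e.getValue()[1]);
        }
    }
}
```

In the real job, each range is put into a partition's ExecutionContext; the reader's SQL uses the bounds in its WHERE clause and the file writer's output path includes the partition name, which is what makes the partitioned steps safe to run in parallel.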
I suggest you write a service whose role is to write your chunks to the files. This service will take one input: a single chunk. Its logic will be to write that chunk to a file. You will write your multithreaded code inside this service.
Your batch will send chunks to this service in a multithreaded way.
So you gain the benefit of multithreading via Spring Batch, and you keep control of potential bugs by writing the service that outputs your chunks to files yourself.
Use an ItemWriterAdapter to delegate the writing stuff to your service.
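A rough sketch of what such a delegate service could look like, assuming plain NIO and an ExecutorService (the class name, method name, pool size, and file-naming scheme are all illustrative; an ItemWriterAdapter would set this object as its targetObject and "write" as its targetMethod):

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicInteger;

// Hypothetical chunk-writing service: each chunk handed to write() is
// flushed to its own file on a worker thread, so no two threads ever
// share a writer and the thread-safety problem goes away.
public class ChunkFileService {

    private final ExecutorService pool = Executors.newFixedThreadPool(4);
    private final AtomicInteger fileIndex = new AtomicInteger();
    private final Path outputDir;

    public ChunkFileService(Path outputDir) throws IOException {
        this.outputDir = Files.createDirectories(outputDir);
    }

    // Called once per chunk; writes the whole chunk to a fresh file.
    public Future<Path> write(List<String> chunk) {
        Path target = outputDir.resolve("chunk-" + fileIndex.getAndIncrement() + ".txt");
        return pool.submit(() -> Files.write(target, chunk));
    }

    public void shutdown() throws InterruptedException {
        pool.shutdown();
        pool.awaitTermination(1, TimeUnit.MINUTES);
    }

    public static void main(String[] args) throws Exception {
        ChunkFileService service = new ChunkFileService(Path.of("out"));
        service.write(List.of("record1", "record2"));
        service.write(List.of("record3", "record4"));
        service.shutdown();
    }
}
```

Note that a restart after failure would need extra care with this approach (which files were already written?), which is one reason the partitioning sample is the more idiomatic Spring Batch answer.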