Our processor returns a List<?> (effectively passing a List<List<?>>) to our ItemWriter.
Now, we observed that the JdbcBatchItemWriter is not programmed to handle item instanceof List. We also observed that, to process item instanceof List, we need to write a custom ItemSqlParameterSourceProvider. The sad part is that it returns a SqlParameterSource, which can handle only one item at a time and is again not capable of handling a List.
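For reference, the provider's contract is one SqlParameterSource per item, which is why it cannot flatten a List:

// org.springframework.batch.item.database.ItemSqlParameterSourceProvider
public interface ItemSqlParameterSourceProvider<T> {
    SqlParameterSource createSqlParameterSource(T item);
}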
So, can someone help us understand how to handle a list of lists in the JdbcBatchItemWriter?
Typically, the design pattern is:
Reader -> reads something, returns ReadItem
Processor -> ingests ReadItem, returns ProcessedItem
Writer -> ingests List<ProcessedItem>
If your processor is returning List<Object>, then you need your Writer to expect List<List<Object>>.
You could do this by wrapping your JdbcBatchItemWriter as a delegate in an ItemWriter that looks something like this:
import java.util.ArrayList;
import java.util.List;

import org.springframework.batch.item.ExecutionContext;
import org.springframework.batch.item.ItemStream;
import org.springframework.batch.item.ItemWriter;
import org.springframework.beans.factory.InitializingBean;
import org.springframework.util.Assert;

public class ListUnpackingItemWriter<T> implements ItemWriter<List<T>>, ItemStream, InitializingBean {

    private ItemWriter<T> delegate;

    @Override
    public void write(final List<? extends List<T>> lists) throws Exception {
        // Flatten the chunk of lists into a single list, then hand it to the delegate.
        final List<T> consolidatedList = new ArrayList<>();
        for (final List<T> list : lists) {
            consolidatedList.addAll(list);
        }
        delegate.write(consolidatedList);
    }

    @Override
    public void afterPropertiesSet() {
        Assert.notNull(delegate, "You must set a delegate!");
    }

    // Propagate stream callbacks so a stateful delegate still behaves correctly.
    @Override
    public void open(ExecutionContext executionContext) {
        if (delegate instanceof ItemStream) {
            ((ItemStream) delegate).open(executionContext);
        }
    }

    @Override
    public void update(ExecutionContext executionContext) {
        if (delegate instanceof ItemStream) {
            ((ItemStream) delegate).update(executionContext);
        }
    }

    @Override
    public void close() {
        if (delegate instanceof ItemStream) {
            ((ItemStream) delegate).close();
        }
    }

    public void setDelegate(ItemWriter<T> delegate) {
        this.delegate = delegate;
    }
}
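For completeness, here is a minimal wiring sketch. The item type (Person), table, and SQL are hypothetical placeholders, and it assumes the JdbcBatchItemWriterBuilder available in Spring Batch 4+:

import javax.sql.DataSource;

import org.springframework.batch.item.database.JdbcBatchItemWriter;
import org.springframework.batch.item.database.builder.JdbcBatchItemWriterBuilder;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class WriterConfig {

    // Hypothetical delegate: writes one Person per row, binding named
    // parameters from the bean's properties.
    @Bean
    public JdbcBatchItemWriter<Person> jdbcWriter(DataSource dataSource) {
        return new JdbcBatchItemWriterBuilder<Person>()
                .dataSource(dataSource)
                .sql("INSERT INTO person (first_name, last_name) VALUES (:firstName, :lastName)")
                .beanMapped()
                .build();
    }

    // The writer the step actually uses: it flattens each List<List<Person>>
    // chunk into a List<Person> before delegating to the JdbcBatchItemWriter.
    @Bean
    public ListUnpackingItemWriter<Person> listUnpackingItemWriter(JdbcBatchItemWriter<Person> jdbcWriter) {
        ListUnpackingItemWriter<Person> writer = new ListUnpackingItemWriter<>();
        writer.setDelegate(jdbcWriter);
        return writer;
    }
}

// Hypothetical item type used above, for illustration only.
class Person {
    private String firstName;
    private String lastName;

    public String getFirstName() { return firstName; }
    public void setFirstName(String firstName) { this.firstName = firstName; }
    public String getLastName() { return lastName; }
    public void setLastName(String lastName) { this.lastName = lastName; }
}

You would then register listUnpackingItemWriter as the step's chunk writer; since it implements ItemStream and forwards the callbacks, a stateful delegate still gets its open/update/close calls.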