public String toString() {
    StringBuilder output = new StringBuilder();
    output.append(firstName);
    output.append(" ");
    output.append(middleInitial);
    output.append(". ");
    output.append(lastName);

    if(transactions != null && transactions.size() > 0) {
        output.append(" has ");
        output.append(transactions.size());
        output.append(" transactions.");
    } else {
        output.append(" has no transactions.");
    }

    return output.toString();
}
When you run the job, you can see each of your customers and the number of transaction records
you read in. It's important to note that when reading records in this way, the customer record and all the
subsequent transaction records are considered a single item. The reason for this is that Spring Batch
considers an item to be any object that is returned by the ItemReader. In this case, the Customer object is
the object returned by the ItemReader, so it is the item used for commit counts and the like. Each
Customer object will be processed once by any configured ItemProcessor you add and once by any
configured ItemWriter. The output from the job configured with the new ItemReaders can be seen in
Listing 7-30.
Listing 7-30. Output from Multiline Job
Warren Q. Darrow has 1 transactions.
Ann V. Gates has no transactions.
Erica I. Jobs has 5 transactions.
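The aggregation described above can be sketched in plain Java without the Spring Batch types. This is not the book's actual reader implementation; the record prefixes CUST, and TRANS, and the class and method names are assumptions for illustration. The key idea is the same: read ahead until a line that is not a transaction appears, push it back, and return the customer plus its transactions as one item.

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Iterator;
import java.util.List;

// Hypothetical, simplified sketch of a multiline-record reader.
// Each call to read() returns one Customer item that carries all of
// the transaction lines that immediately follow its customer line.
public class MultilineReaderSketch {

    record Customer(String name, List<String> transactions) {}

    private final Iterator<String> lines;
    private String pushedBack; // one-line lookahead buffer

    MultilineReaderSketch(List<String> input) {
        this.lines = input.iterator();
    }

    private String nextLine() {
        if (pushedBack != null) {
            String line = pushedBack;
            pushedBack = null;
            return line;
        }
        return lines.hasNext() ? lines.next() : null;
    }

    // Returns one aggregated Customer per call, or null at end of input.
    public Customer read() {
        String line = nextLine();
        if (line == null) {
            return null;
        }
        // Assumed format: customer lines start with "CUST,",
        // transaction lines with "TRANS,".
        Customer customer = new Customer(line.substring(5), new ArrayList<>());
        String next;
        while ((next = nextLine()) != null) {
            if (next.startsWith("TRANS,")) {
                customer.transactions().add(next.substring(6));
            } else {
                pushedBack = next; // start of the next customer; keep for the next read()
                break;
            }
        }
        return customer;
    }

    public static void main(String[] args) {
        MultilineReaderSketch reader = new MultilineReaderSketch(Arrays.asList(
                "CUST,Warren Q. Darrow",
                "TRANS,100.00",
                "CUST,Ann V. Gates"));
        Customer c;
        while ((c = reader.read()) != null) {
            System.out.println(c.name() + " has " + c.transactions().size() + " transactions");
        }
    }
}
```

Because read() returns the fully assembled Customer, the framework wrapping it would count one item per customer regardless of how many transaction lines were consumed, which is exactly the commit-count behavior described above.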
Multiline records are a common element in batch processing. Although they are a bit more complex
than basic record processing, as you can see from this example, only a minimal amount of actual
code needs to be written to handle them.
The last piece of the flat file puzzle is to look at input situations where you read in from multiple
files. This is a common requirement in the batch world and it's covered in the next section.
Multiple Sources
The examples up to this point have been based around a customer file with transactions for each
customer. Many companies have multiple departments or locations that sell things. Take, for example, a
restaurant chain with restaurants nationwide. Each location may contribute a file with the same format