Other Uses for Spring Batch
I bet by now you're wondering whether replacing the mainframe is all Spring Batch is good for. When you think
about the projects you face on an ongoing basis, it isn't every day that you're ripping out COBOL code. If
that were all the framework could do, it wouldn't be very useful. In reality, it can help you with many
other use cases.
The most common use case is data migration. As you rewrite systems, you typically end up
migrating data from one form to another. The risk is that you may write one-off solutions that are poorly
tested and don't have the data-integrity controls that your regular development has. However, when you
think about the features of Spring Batch, it seems like a natural fit. You don't have to do a lot of coding to
get a simple batch job up and running, yet Spring Batch provides things like commit counts and rollback
functionality that most data migrations should include but rarely do.
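The commit-count and rollback behavior described above can be sketched in plain Java. This is not Spring Batch's actual API (in the framework, the chunk loop, transactions, and restartability are provided for you); the class and method names here are invented to make the idea visible: items are processed in chunks of a fixed commit interval, and a failure anywhere in a chunk discards that whole chunk, standing in for a transaction rollback.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Function;

// Plain-Java sketch of chunk-oriented processing with a commit interval.
// In Spring Batch the framework drives this loop inside a transaction;
// here we hand-roll it so the mechanics are visible.
public class ChunkSketch {

    // Processes items in chunks of `commitInterval`. If processing any
    // item in a chunk throws, the whole chunk is discarded (a stand-in
    // for a transaction rollback) and processing continues with the next.
    public static <I, O> List<O> process(List<I> items,
                                         Function<I, O> processor,
                                         int commitInterval) {
        List<O> written = new ArrayList<>();
        for (int start = 0; start < items.size(); start += commitInterval) {
            int end = Math.min(start + commitInterval, items.size());
            List<O> chunk = new ArrayList<>();
            try {
                for (I item : items.subList(start, end)) {
                    chunk.add(processor.apply(item));   // "process" one item
                }
                written.addAll(chunk);                  // "write" = commit point
            } catch (RuntimeException e) {
                // Rollback: drop the partially processed chunk.
            }
        }
        return written;
    }

    public static void main(String[] args) {
        // Item 4 fails, so its whole chunk (3, 4) rolls back;
        // the surrounding chunks commit normally.
        List<Integer> out = process(
                List.of(1, 2, 3, 4, 5, 6),
                n -> { if (n == 4) throw new IllegalStateException(); return n * 10; },
                2);
        System.out.println(out); // -> [10, 20, 50, 60]
    }
}
```

A hand-rolled migration script rarely bothers with this chunk/rollback discipline, which is exactly why the framework's built-in version is a good fit for one-off data migrations.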
A second common use case for Spring Batch is any process that requires parallelized processing. As
chipmakers approach the limits of Moore's Law, developers realize that the only way to continue to
increase the performance of apps is not to process single transactions faster, but to process more
transactions in parallel. Many frameworks have been released in recent years to assist with parallel
processing: Apache Hadoop's MapReduce implementation, GridGain, and others attempt to take advantage
of both multicore processors and the numerous servers available via the cloud. However, frameworks like
Hadoop require you to alter your code and data to fit their algorithms
or data structures. Spring Batch provides the ability to scale your process across multiple cores or servers
(as shown in Figure 1-1 with master/slave step configurations) and still be able to use the same objects
and datasources that your web applications use.
Figure 1-1. Simplifying parallel processing (a master step fanning work out to three slave steps)
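The master/slave arrangement amounts to a master partitioning the input and each slave processing its partition independently. The following plain-Java sketch uses a thread pool to stand in for the slave steps; the names are illustrative, not Spring Batch's partitioning API, and the point is only that the same processing logic runs unchanged whether there is one worker or many.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ExecutionException;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;
import java.util.stream.Collectors;

// Sketch of a master step fanning work out to slave steps.
// The "master" partitions the input list; each "slave" is a task on a
// thread pool that processes its partition with the same logic.
public class PartitionSketch {

    public static List<Integer> run(List<Integer> items, int slaves) {
        ExecutorService pool = Executors.newFixedThreadPool(slaves);
        try {
            List<Future<List<Integer>>> futures = new ArrayList<>();
            int size = (items.size() + slaves - 1) / slaves; // partition size
            for (int i = 0; i < items.size(); i += size) {
                List<Integer> part = items.subList(i, Math.min(i + size, items.size()));
                // Each "slave step" applies identical processing to its partition.
                futures.add(pool.submit(() ->
                        part.stream().map(n -> n * n).collect(Collectors.toList())));
            }
            List<Integer> results = new ArrayList<>();
            for (Future<List<Integer>> f : futures) {
                results.addAll(f.get()); // master gathers results in order
            }
            return results;
        } catch (InterruptedException | ExecutionException e) {
            throw new RuntimeException(e);
        } finally {
            pool.shutdown();
        }
    }

    public static void main(String[] args) {
        System.out.println(run(List.of(1, 2, 3, 4, 5), 2)); // -> [1, 4, 9, 16, 25]
    }
}
```

In Spring Batch the partitioning, threading, and result gathering are configuration concerns rather than code you write, which is what lets the same readers, writers, and datasources your web applications use scale across cores or servers.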
Finally, you come to constant, or 24/7, processing. In many use cases, systems receive a constant or near-
constant feed of data. Although accepting this data at the rate it comes in is necessary for preventing
backlogs, when you look at the processing of that data, it may be more performant to batch the data into
chunks to be processed at once (as shown in Figure 1-2). Spring Batch provides tools that let you do this
type of processing in a reliable, scalable way. Using the framework's features, you can do things like read
messages from a queue, batch them into chunks, and process them together in a never-ending loop.
Thus you can increase throughput in high-volume situations without having to understand the complex
nuances of developing such a solution from scratch.
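The read-then-batch loop described above can be sketched in plain Java. An in-memory queue stands in for the message feed (in a real job you would poll a broker through a message-queue-backed reader, and the loop would never end); the names below are illustrative, and only the chunking logic is the point.

```java
import java.util.ArrayDeque;
import java.util.ArrayList;
import java.util.List;
import java.util.Queue;

// Sketch of batching a message feed into fixed-size chunks so each
// chunk can be processed as one unit instead of message by message.
public class QueueBatchSketch {

    // Drains the queue into chunks of at most `chunkSize` messages.
    // Each completed chunk represents one processing/commit unit.
    public static List<List<String>> drain(Queue<String> feed, int chunkSize) {
        List<List<String>> processedChunks = new ArrayList<>();
        while (!feed.isEmpty()) {
            List<String> chunk = new ArrayList<>(chunkSize);
            while (chunk.size() < chunkSize && !feed.isEmpty()) {
                chunk.add(feed.poll());       // "read" one message from the feed
            }
            processedChunks.add(chunk);       // "process/write" the chunk as a unit
        }
        return processedChunks;
    }

    public static void main(String[] args) {
        Queue<String> feed = new ArrayDeque<>(List.of("m1", "m2", "m3", "m4", "m5"));
        System.out.println(drain(feed, 2)); // -> [[m1, m2], [m3, m4], [m5]]
    }
}
```

Processing per chunk rather than per message is what buys the throughput: fixed per-unit costs such as transaction commits are paid once per chunk instead of once per message.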