You renamed the channel and used an outbound-channel-adapter instead of an inbound-
channel-adapter. The only novel part is the method attribute on the
outbound-channel-adapter element, which gives the component an extra level of insulation
from the Spring Integration APIs.
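The configuration itself appears earlier in the recipe; a minimal sketch of the shape being described might look like the following. The bean id twitterMessageSender and the method name send are assumptions for illustration, while the channel name outboundTweets matches the test code below.

```xml
<!-- Sketch only: ref and method are illustrative names, not the book's exact config. -->
<channel id="outboundTweets"/>

<outbound-channel-adapter
    channel="outboundTweets"
    ref="twitterMessageSender"
    method="send"/>
```

The method attribute lets the adapter invoke a plain POJO method, so the component needs no Spring Integration imports at all.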
Using the component is easy, and a quick test might look like this:
// Import packages reflect the Spring Integration version current when this
// recipe was written; later versions relocate Message and MessageBuilder.
import org.springframework.context.support.ClassPathXmlApplicationContext;
import org.springframework.integration.Message;
import org.springframework.integration.channel.DirectChannel;
import org.springframework.integration.support.MessageBuilder;

public static void main(String[] args) throws Throwable {
    ClassPathXmlApplicationContext applicationContext =
            new ClassPathXmlApplicationContext("solution032.xml");
    applicationContext.start();
    DirectChannel channel = (DirectChannel)
            applicationContext.getBean("outboundTweets");
    Message<String> helloWorldMessage =
            MessageBuilder.withPayload("Hello, world!").build();
    channel.send(helloWorldMessage);
}
The example's even simpler than the test code for the inbound adapter! The code goes through
the motions of setting up a Message and then simply sends it. Confirm by checking your status on
twitter.com.
8-11. Staging Events Using Spring Batch
Problem
You have a file with a million records in it that you need to process on the bus.
Solution
Spring Batch is well suited to this type of problem. It lets you take an input file or a payload
and reliably, systematically decompose it into events that an ESB can work with.
How It Works
Spring Integration does support reading files into the bus, and Spring Batch does support providing
custom, unique endpoints for data. However, as Mom always says, "just because you can
doesn't mean you should."
Although it seems as if there's a lot of overlap here, there is a distinction, albeit a
fine one. Both systems will work with files, message queues, or anything else you could
conceivably write code to talk to. Spring Integration, however, doesn't handle large payloads well:
a file with a million rows that might require hours of work is simply too big a burden to treat as a
single event on an ESB. At that scale, the term event loses its meaning. A million
records in a CSV file isn't an event on a bus; it's a file with a million records, each of which might
in turn be an event.
It's a subtle distinction.
A file with a million rows needs to be decomposed into smaller events. Spring Batch can help here:
it allows you to read through the file systematically, apply validations, and optionally skip or retry
invalid records. The per-record processing can then happen on an ESB such as Spring Integration.
Spring Batch and Spring Integration can be used together to build truly scalable, decoupled systems.
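The decomposition idea can be illustrated without either framework. The following is a framework-free sketch, not Spring Batch's actual API: it reads "records," skips invalid rows the way a Batch step might be configured to, and hands each valid record to a handler playing the role an ESB channel would play. All names here are illustrative.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;

// Sketch of staging: decompose a large record file into per-record events,
// skipping invalid rows instead of failing the whole file.
public class RecordStagingSketch {

    // Validates each record and hands every valid one to the publisher --
    // the role a message channel would play. Returns the count published.
    public static int stage(List<String> records, Consumer<String> publish) {
        int published = 0;
        for (String record : records) {
            if (record == null || record.isEmpty()) {
                continue; // skip the invalid record, keep processing the rest
            }
            publish.accept(record); // each record becomes one small event
            published++;
        }
        return published;
    }

    public static void main(String[] args) {
        List<String> file = new ArrayList<>();
        file.add("1,Hello");
        file.add("");          // invalid row: skipped, not fatal
        file.add("2,world");

        List<String> events = new ArrayList<>();
        int count = stage(file, events::add);
        System.out.println(count + " events published: " + events);
    }
}
```

In the real pairing, a Spring Batch ItemReader would stream the file chunk by chunk, and each valid item would be sent to a Spring Integration channel rather than a Consumer.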
 