5 Data Acquisition Using CEP
In this section, data acquisition for simulation using CEP is introduced. The discussion remains at an abstract level, without going into the details of a particular tool or technique.
Complex event processing (CEP) is designed to deal with the increasing velocity, variety, value and volume of data, known as Big Data. CEP is defined as a set of tools and techniques for analyzing and controlling complex series of interrelated events, whereby the events are processed as they happen, i.e. continuously and in a timely manner [19, 32]. The event is the central concept of CEP and is defined as "anything that happens, or is contemplated as happening (change of state)" [33], e.g. an RFID-enabled good is recognized by an RFID reader. If an event summarizes, represents or denotes a set of other events, it is a so-called "complex event", e.g. a good has left the issuing area [33]. In this paper it is assumed that CEP is already used to monitor an instantiated logistics network [6]. The next paragraph exemplifies this and emphasizes the adequacy of applying CEP in the area of a 4PL.
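To illustrate the distinction between simple and complex events, the following minimal Python sketch derives the complex event "good left the issuing area" from simple RFID read events. The event types, attribute names and the aggregation rule are illustrative assumptions, not part of any particular CEP engine.

from dataclasses import dataclass
from datetime import datetime
from typing import List, Optional

@dataclass
class RFIDReadEvent:
    """Simple event: an RFID reader recognizes an RFID-enabled good."""
    good_id: str
    reader_id: str        # e.g. the gate reader at the exit of the issuing area
    timestamp: datetime

@dataclass
class GoodLeftIssuingArea:
    """Complex event: summarizes a set of simple RFID read events."""
    good_id: str
    left_at: datetime

def detect_departure(reads: List[RFIDReadEvent],
                     exit_reader_id: str) -> Optional[GoodLeftIssuingArea]:
    """Derive the complex event if the good was last seen at the exit reader."""
    if not reads:
        return None
    last = max(reads, key=lambda r: r.timestamp)
    if last.reader_id == exit_reader_id:
        return GoodLeftIssuingArea(good_id=last.good_id, left_at=last.timestamp)
    return None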
The outsourced services between the 4PL and the customer as well as between the 4PL and the service providers are contractually secured. A contract records the agreed-upon obligations and responsibilities of the contractual parties in terms of business process conditions [34]. These conditions are often expressed as goals which must be achieved by each party. The goals can be extracted from the customer's needs or from legal regulations and are known as Service Level Objectives (SLOs), which define measurable indicators like delivery quality, delivery reliability or delivery flexibility. The contract must exist in a formalized form that the CEP engine is capable of working with. This contract describes the target state of each logistics service (LS) realized by the participants of the network and acts like a pattern. As soon as the process execution is started (and thus instantiated), the 4PL has to ensure the fulfillment of the defined SLOs. To achieve this, internal (e.g. a good left the issuing area) and external (e.g. a traffic jam) data regarding the good are pushed to the 4PL. In this way the 4PL can ensure that possible penalties (e.g. for a delayed or damaged good) are assigned to the "faulty" participant of the network. If it cannot be traced which participant of the network is at fault, a logistics network will not be robust and sustainable over a longer period. Furthermore, the use of CEP allows forecasting whether an instantiated process will meet the SLOs in the future [6].
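As an illustration of such a formalized contract fragment, the sketch below captures the target state of one logistics service as a machine-readable Service Level Objective. The class, attribute names and threshold values are assumptions chosen for illustration only.

from dataclasses import dataclass
from datetime import timedelta

@dataclass
class ServiceLevelObjective:
    """Formalized target state of one logistics service (LS)."""
    service_id: str               # logistics service within the network
    provider_id: str              # LSP responsible for this service
    max_delivery_time: timedelta  # delivery reliability target
    max_damage_rate: float        # delivery quality target, e.g. 0.01 = 1 %

# Hypothetical contract excerpt: the target state the CEP engine uses as a pattern.
contract_slos = [
    ServiceLevelObjective(service_id="LS-transport-01",
                          provider_id="LSP-A",
                          max_delivery_time=timedelta(hours=48),
                          max_damage_rate=0.01),
]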
The incoming data, which describe the actual state, are compared with the SLOs, which describe the target state. This comparison takes place within the CEP engine. All data are processed to evaluate the process execution of every logistics network partner and to build up service profiles. The service profiles include key performance indicators which benchmark the logistics service providers (LSP) and their services. In contrast to current instruments, this evaluation takes place during the process run-time and not at the expiration (retirement, see Fig. 1) of a process. If delays are identified, an alarm is triggered and data about the failure are raised, e.g. the duration of or the reasons for a delivery delay. The following example and explanation briefly illustrate the suitability of CEP in the area of the 4PL business model at a more detailed level.
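Continuing the sketch above (and reusing the ServiceLevelObjective structure defined there), the following fragment illustrates how an incoming event describing the actual state could be checked against the contracted target state, with an alarm raised on a delay. The event and alarm structures are again illustrative assumptions rather than the interface of a concrete CEP engine.

from dataclasses import dataclass
from datetime import datetime
from typing import Optional

@dataclass
class DeliveryCompleted:
    """Actual state reported for an instantiated logistics service."""
    service_id: str
    started_at: datetime
    delivered_at: datetime

@dataclass
class SLOAlarm:
    """Raised when the actual state violates the contracted target state."""
    service_id: str
    provider_id: str
    delay_hours: float

def check_against_slo(event: DeliveryCompleted,
                      slo: ServiceLevelObjective) -> Optional[SLOAlarm]:
    """Compare the actual delivery time with the SLO target at run-time."""
    actual = event.delivered_at - event.started_at
    if actual > slo.max_delivery_time:
        delay = (actual - slo.max_delivery_time).total_seconds() / 3600
        return SLOAlarm(slo.service_id, slo.provider_id, delay_hours=delay)
    # Target met: no alarm is raised, only the service profile is updated.
    return None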
Fig. 4 illustrates a possible material and data flow of a specific logistics network. As seen in the material flow layer, three LSPs take part in the logistics network to accomplish the contract between the 4PL and the customer. The squares represent