The only solution is to decrypt the packets at each network access point, read, copy, and analyze the header information, and then re-encrypt the packets before sending them on. In addition to the added computational expense, this defeats the purpose of the trust relationship between the communicating parties and exposes them to immense risk. Instead of providing security by increasing visibility into the encrypted packets, this solution actually creates new points of failure from an information-security perspective.
Fig. 3. Encrypted traffic visibility problem for firewall audit
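As a rough sketch of what the per-hop inspection described above entails, the code below decrypts a packet, copies its header for audit, and re-encrypts it before forwarding. It uses the Python cryptography library's Fernet cipher purely for illustration; the function name, the packet format, and the assumption that every access point holds the session key are illustrative choices, not anything specified in the text.

```python
# Conceptual sketch (not a description of any real middlebox): a per-hop
# inspection point that must hold the session key to decrypt, log header
# fields, and re-encrypt. Packet format and names are hypothetical.
import json
from cryptography.fernet import Fernet

def inspect_at_access_point(token: bytes, key: bytes) -> bytes:
    """Decrypt a packet, copy/analyze its header, then re-encrypt it."""
    f = Fernet(key)
    plaintext = f.decrypt(token)            # the access point needs the session key
    packet = json.loads(plaintext)
    audit_copy = dict(packet["header"])     # header is exposed and copied here
    print("audit log:", audit_copy)         # added computation and a new point of failure
    return f.encrypt(plaintext)             # re-encrypt before forwarding

key = Fernet.generate_key()                 # key must be shared with every inspection hop
original = Fernet(key).encrypt(
    json.dumps({"header": {"src": "10.0.0.1", "dst": "10.0.0.2"},
                "payload": "..."}).encode())
forwarded = inspect_at_access_point(original, key)
```

The fact that the key has to be distributed to every hop is exactly the exposure noted above: each inspection point becomes another place where the trust between the endpoints can be broken.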
4 Intrusion Detection Systems (IDS)
Large-scale heterogeneous networks generate tremendous amounts of temporal event data in very diverse formats. In reality, much of this data has very little to do with security at all; most of it relates to system and network faults caused by misconfiguration. Only careful analysis can distinguish security data from non-security data, making this an extremely noisy environment. When an IDS attempts to analyze and correlate these events, correct interpretation of the event semantics becomes very important for minimizing false positives (false alarms).
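To make the noise problem concrete, the following minimal triage sketch discards events whose semantics indicate configuration or fault noise and keeps only candidates for security correlation. The event schema, the fault categories, and the threshold are assumptions made for illustration, not taken from the text.

```python
# Minimal sketch, assuming a hypothetical normalized event format: most events
# are fault/misconfiguration noise, and only interpreting event semantics
# separates them from security-relevant ones.
from dataclasses import dataclass

@dataclass
class Event:
    source: str
    kind: str        # e.g. "auth_failure", "link_down", "dns_timeout"
    count: int       # occurrences in the observation window

FAULT_KINDS = {"link_down", "dns_timeout", "dhcp_error"}   # typical misconfiguration noise

def triage(events: list[Event]) -> list[Event]:
    """Keep only events whose semantics suggest a security issue."""
    suspicious = []
    for e in events:
        if e.kind in FAULT_KINDS:
            continue                          # network/system fault, not security
        if e.kind == "auth_failure" and e.count < 3:
            continue                          # isolated failures are usually benign
        suspicious.append(e)                  # everything else goes to the correlator
    return suspicious

events = [Event("fw1", "link_down", 40),
          Event("host7", "auth_failure", 25),
          Event("dns2", "dns_timeout", 12)]
print(triage(events))                         # only the repeated auth failures survive
```

Misclassifying even a small fraction of the fault events as security events would flood the correlation stage with false alarms, which is why the interpretation step matters so much.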
1. Intrusion detection architecture. Today's IDS products depend heavily on centralized event processing, a traditional passive, one-way information-processing architecture. IDS sensors are placed at many locations in the network; their role is to collect data and perform simple analysis, while the bulk of the analysis, discovery, and correlation is done at the centralized monitoring engine (see the sketch after this item). This architecture faces considerable challenges in scaling up to meet the demands of today's large and complex networks. Too much of the burden is placed on the central machine to perform the analysis. Also, in a centralized event-processing architecture, by the time the huge volume of data arrives at the central location, the contextual information needed to properly analyze the events has already been lost. That information existed only in the original environments where the data were generated. Without the right information for interpretation, it is difficult to perform
adequate correlation. Even worse, the accumulated time latency may have already made a timely response impossible.
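The sketch below illustrates the contrast described in this item, under stated assumptions: sensors attach their local context (network segment, host role, observation time) at the point where the data is generated, so the central correlation engine is not left interpreting context-free events. The class names and fields are hypothetical and do not reflect any particular IDS product.

```python
# Minimal sketch, not from the source: sensors preserve local context before
# forwarding; the central engine only knows what it is sent.
import time
from dataclasses import dataclass, field

@dataclass
class EnrichedEvent:
    sensor_id: str
    message: str
    context: dict = field(default_factory=dict)

class Sensor:
    """Collects data, does simple analysis, and preserves local context."""
    def __init__(self, sensor_id: str, local_context: dict):
        self.sensor_id = sensor_id
        self.local_context = local_context

    def emit(self, message: str) -> EnrichedEvent:
        ctx = dict(self.local_context)
        ctx["observed_at"] = time.time()      # context captured where the data is generated
        return EnrichedEvent(self.sensor_id, message, ctx)

class CentralEngine:
    """Centralized correlation: it only sees what the sensors forwarded."""
    def __init__(self):
        self.events: list[EnrichedEvent] = []

    def ingest(self, event: EnrichedEvent) -> None:
        self.events.append(event)

    def correlate(self) -> list[str]:
        # Without per-sensor context, a "port scan" from a lab subnet and one
        # from a production DMZ would be indistinguishable here.
        return [f'{e.context.get("segment", "unknown")}: {e.message}'
                for e in self.events]

dmz_sensor = Sensor("s1", {"segment": "dmz", "host_role": "web server"})
lab_sensor = Sensor("s2", {"segment": "lab", "host_role": "test box"})
engine = CentralEngine()
engine.ingest(dmz_sensor.emit("port scan detected"))
engine.ingest(lab_sensor.emit("port scan detected"))
print(engine.correlate())
```

Running it shows that two identical "port scan detected" messages correlate quite differently once the originating segment is known, which is precisely the information a purely centralized design tends to lose.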