19.13.1 ATM Networks
With the explosion of information, we have also seen the development of new ways of transmitting it. One of the most efficient ways of transferring information among a large number of users is asynchronous transfer mode (ATM) technology. In the past, communication usually took place over dedicated channels; that is, in order to communicate between two points, a channel was dedicated solely to transferring information between those two points. Even if no information was being transferred during a particular period, the channel could not be used by anyone else. Because of the inefficiency of this approach, there is an increasing movement away from it. In an ATM network, users divide their information into packets, which are transmitted over channels that can be shared by more than one user.
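To make the packetization step concrete, the following Python sketch breaks a message into fixed-size cells that can be interleaved on a shared link with cells from other users. The 48-byte payload matches the payload size of an ATM cell, but the simple two-field header used here (a channel identifier and a sequence number) is an illustrative placeholder, not the actual ATM cell header.

PAYLOAD_SIZE = 48  # bytes of user data per cell; matches the ATM cell payload size


def packetize(message: bytes, channel_id: int) -> list[dict]:
    """Break a message into equal-sized cells so that cells from many
    users can share the same physical channel."""
    cells = []
    for seq, start in enumerate(range(0, len(message), PAYLOAD_SIZE)):
        chunk = message[start:start + PAYLOAD_SIZE]
        payload = chunk.ljust(PAYLOAD_SIZE, b"\x00")  # pad the final cell
        cells.append({"channel": channel_id, "seq": seq, "payload": payload})
    return cells


# Example: a long message becomes a sequence of numbered cells.
msg = b"A message long enough to span several cells. " * 4
for cell in packetize(msg, channel_id=7):
    print(cell["channel"], cell["seq"], len(cell["payload"]))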
We can draw an analogy between the movement of packets over a communication network
and the movement of automobiles over a road network. If we break up a message into packets,
then the movement of the message over the network is like the movement of a number of cars
on a highway system going from one point to the other. Although two cars may not occupy the
same position at the same time, they can occupy the same road at the same time. Thus, more
than one group of cars can use the road at any given time. Furthermore, not all the cars in the
group have to take the same route. Depending on the amount of traffic on the various roads
that run between the origin of the traffic and the destination, different cars can take different
routes. This is a more efficient utilization of the road than if the entire road were blocked off until the first group of cars completed its traversal.
Using this analogy, we can see that the availability of transmission capacity, that is, the
number of bits per second that we can transmit, is affected by factors that are outside our
control. If at a given time there is very little traffic on the network, the available capacity will
be high. On the other hand, if there is congestion on the network, the available capacity will
be low. Furthermore, the ability to take alternate routes through the network also means that
some of the packets may encounter congestion, leading to a variable amount of delay through
the network. In order to prevent congestion from impeding the flow of vital traffic, networks
will prioritize the traffic, permitting higher-priority traffic to move ahead of lower-priority traffic. Users can negotiate with the network for a fixed amount of guaranteed traffic. Of course, such guarantees tend to be expensive, so it is important that users have some idea of how much high-priority traffic they will be transmitting over the network.
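The effect of prioritization can be illustrated with a toy strict-priority scheduler: whenever the link is ready, the highest-priority cell waiting in the queue is transmitted first. This is only a sketch of the idea, not the traffic-management mechanism actually used in ATM networks.

import heapq
from itertools import count


class PriorityLink:
    """Toy strict-priority transmit queue; lower numbers mean higher priority."""

    def __init__(self):
        self._queue = []
        self._order = count()  # keeps cells of equal priority in arrival order

    def enqueue(self, priority: int, cell: bytes) -> None:
        heapq.heappush(self._queue, (priority, next(self._order), cell))

    def transmit_next(self) -> bytes | None:
        if not self._queue:
            return None
        _, _, cell = heapq.heappop(self._queue)
        return cell


# Example: the guaranteed (highest-priority) cell is sent first even
# though it arrived after the background data.
link = PriorityLink()
link.enqueue(2, b"background file transfer")
link.enqueue(0, b"guaranteed video cell")
link.enqueue(1, b"best-effort voice cell")
print(link.transmit_next())  # b"guaranteed video cell"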
19.13.2 Compression Issues in ATM Networks
In video coding, this situation provides both opportunities and challenges. In the video compression algorithms discussed previously, there is a buffer that smooths the output of the compression algorithm. Thus, if we encounter a high-activity region of the video and generate more than the average number of bits per second, this period has to be followed by a period in which we generate fewer bits per second than the average in order to prevent the buffer from overflowing. Sometimes this may happen naturally, with periods of low activity following periods of high activity. However, there is no guarantee that it will, in which case we have to reduce the quality by increasing the quantizer step size, dropping coefficients, or perhaps even dropping entire frames.
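A minimal sketch of this buffer feedback is given below. The channel rate, buffer size, and step-size update rule are made-up numbers chosen for illustration, not the rate-control scheme of any particular video coding standard. The point is simply that as the buffer fills, the quantizer step size is increased so the encoder produces fewer bits, and as the buffer drains, the step size can be reduced again.

CHANNEL_RATE = 40_000   # bits drained from the buffer per frame interval (assumed)
BUFFER_SIZE = 200_000   # encoder buffer capacity in bits (assumed)


def encode_frame(activity: float, step_size: float) -> int:
    """Stand-in for an encoder: busier scenes and finer quantization
    both produce more bits."""
    return int(activity * 100_000 / step_size)


def simulate(activities):
    fullness = 0
    step_size = 4.0
    for activity in activities:
        bits = encode_frame(activity, step_size)
        fullness = max(0, fullness + bits - CHANNEL_RATE)
        occupancy = fullness / BUFFER_SIZE
        # Feedback: coarsen the quantizer as the buffer fills,
        # refine it again as the buffer empties.
        step_size = max(1.0, 2.0 + 8.0 * occupancy)
        print(f"activity={activity:.1f}  bits={bits}  "
              f"buffer={fullness}  step size={step_size:.2f}")


# A burst of high-activity frames followed by a quiet stretch.
simulate([0.5, 0.6, 2.0, 2.5, 2.0, 0.4, 0.3])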