Image Processing Reference
combines packets of encoded data from a variety of elementary sources and creates a single stream with them interleaved together.
That audio and video track pair might then have some data tracks added to carry
URL and subtitle information and might then be prepared for streaming over the Web. The
hinting process runs through a file-based version of the stream and adds information that
the streaming server uses when delivering the content to the client player.
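The interleaving step can be sketched in a few lines. This is a minimal illustration, not any real muxer's API: it assumes each elementary stream is a list of (timestamp, track_id, payload) tuples, already sorted by timestamp within its own track, and merges them into one timestamp-ordered stream.

```python
import heapq

def multiplex(*streams):
    """Interleave timestamped packets from several elementary streams
    into a single stream ordered by timestamp (a simplified muxer).
    Each stream must already be sorted by timestamp internally."""
    return list(heapq.merge(*streams, key=lambda pkt: pkt[0]))

# Hypothetical audio and video packet lists (timestamps in milliseconds):
# a 25 fps video track and an audio track with shorter packet intervals.
video = [(0, "video", "V0"), (40, "video", "V1"), (80, "video", "V2")]
audio = [(0, "audio", "A0"), (23, "audio", "A1"), (46, "audio", "A2")]

muxed = multiplex(video, audio)
```

Because the merge preserves timestamp order across tracks, the player can decode audio and video packets as they arrive without seeking back and forth in the stream.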
For live streaming, all of the encoding has to happen on the fly. Moreover, it has to happen quickly enough that the outgoing stream keeps pace with the incoming video; otherwise the input buffer overflows. The encoder then runs out of capacity and must either stop encoding or jump forward to a new location, missing a section. The resulting effect is a very unsatisfactory viewing experience.
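The keeping-pace requirement can be made concrete with a toy simulation. The rates and buffer size below are invented for illustration: frames arrive at a fixed rate, the encoder drains what it can each second, and any excess beyond the buffer capacity is lost.

```python
def simulate_live_encode(in_fps, encode_fps, buffer_frames, duration_s):
    """Simulate a live encoder fed from a bounded input buffer.
    If encode throughput falls below the input rate, the buffer
    eventually overflows and frames are dropped."""
    buffered = 0.0
    dropped = 0.0
    for _ in range(duration_s):
        buffered += in_fps                        # new frames arrive
        buffered -= min(buffered, encode_fps)     # encoder drains frames
        if buffered > buffer_frames:              # overflow: frames lost
            dropped += buffered - buffer_frames
            buffered = buffer_frames
    return dropped

# A 25 fps feed against a 20 fps encoder overflows a 100-frame buffer
# after about 20 seconds; a 30 fps encoder never falls behind.
slow = simulate_live_encode(25, 20, 100, 60)
fast = simulate_live_encode(25, 30, 100, 60)
```

The point of the sketch is that a small sustained shortfall (here 5 fps) is enough: a larger buffer only delays the overflow, it cannot prevent it.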
To avoid this, you need to run the encoding and streaming process on a sufficiently powerful computer. Modern codecs deliver a higher compression ratio than their older counterparts, but at the expense of significantly more compute power. Therefore the choice of codec and compute platform makes a lot of difference to your ability to serve live-streamed content.
Encoding for Broadcast TV
When encoding video for broadcast TV, a hardware-based solution is used in most scenarios. Coding and multiplexing multiple program streams together is a complex process, and the architecture of your streaming service determines how effectively you utilize the available bit rate.
Broadcast TV services are delivered within a fixed bit rate for the entire multiplex or transponder. Within that fixed bit rate, individual streams may be coded with variable bit rates, constant bit rates, or statistically multiplexed variable bit rates. Figure 15-1 illustrates the complexity of a DSat broadcast, and even this example is somewhat simplified. You can see that some channels are available only for part of the day and that their capacity is used for quite different things, depending on when you are viewing.
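A simple allocation model shows how these modes share a fixed multiplex. The figures are hypothetical: CBR services take fixed slices of the transponder, and the remainder is divided among the statistically multiplexed VBR services in proportion to their current scene complexity.

```python
def allocate_multiplex(total_kbps, cbr_streams, vbr_demands):
    """Allocate a fixed transponder bit rate: CBR streams take fixed
    slices; the rest is shared among statistically multiplexed VBR
    streams in proportion to their current complexity demand."""
    pool = total_kbps - sum(cbr_streams.values())  # capacity left for stat-mux
    demand_total = sum(vbr_demands.values())
    vbr_alloc = {name: pool * d / demand_total
                 for name, d in vbr_demands.items()}
    return {**cbr_streams, **vbr_alloc}

# Hypothetical 27,500 kbps transponder: two CBR services plus three
# stat-muxed movie channels whose demands vary from scene to scene.
alloc = allocate_multiplex(
    27500,
    cbr_streams={"news": 4000, "radio": 500},
    vbr_demands={"movie1": 3.0, "movie2": 1.0, "movie3": 1.0},
)
```

Because the allocations are recomputed as the demands change, a channel showing an action scene can briefly borrow bits from channels showing static content, yet the total never exceeds the transponder's fixed capacity.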
Inserting Additional Data Feeds
Depending on where your streamed service is to be deployed, you may add supplementary data feeds to the outgoing stream. There are several likely alternatives here. Any data that must be synchronized with the video should be added as the video is encoded. Postprocessing is possible when encoding to a file, since the content is managed offline, but when encoding live, the data must be coded at the same time. Subtitle streams, for example, may actually be typed by operators in real time or might be taken from the script that