Image Processing Reference
In-Depth Information
Interactivity, scripting support, and active code written in Java and JavaScript are all
possible.
39.5.3
MPEG-4 FlexMux Timeline
The FlexMux provides a way to interleave data from separate elementary streams into one
serial bit stream. MPEG-2 could do this too, but it lacked some significant capabilities for
ensuring that presentation timing could be synchronized with external events.
The MPEG-4 design is scoped to manage a very large number of simultaneous
streams and to solve interactivity and trigger-synchronization problems that are very hard
to deal with when deploying MPEG-2 on digital video broadcasting (DVB) systems, for example.
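The timestamp-ordered interleaving that a multiplexer of this kind performs can be sketched as a k-way merge of per-stream access units. The stream contents, timestamps, and payload labels below are invented for illustration; this is not the actual FlexMux packet syntax, only the ordering idea:

```python
import heapq

def interleave(streams):
    """Merge access units from several elementary streams into one
    serial stream, ordered by timestamp.  Each stream is a list of
    (timestamp, payload) tuples, already sorted by timestamp."""
    tagged = [
        [(ts, stream_id, payload) for ts, payload in stream]
        for stream_id, stream in enumerate(streams)
    ]
    # heapq.merge performs a k-way merge of the sorted per-stream
    # lists, so the multiplexed output is globally timestamp-ordered.
    return list(heapq.merge(*tagged))

# Hypothetical audio and video elementary streams (ms timestamps).
audio = [(0, "a0"), (40, "a1"), (80, "a2")]
video = [(0, "v0"), (33, "v1"), (66, "v2")]
mux = interleave([audio, video])
```

A real demultiplexer would reverse the process, using the stream identifier in each unit to route payloads back to the right decoder.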
39.5.4
Object-Based Hierarchical Model
The MPEG-4 framework is an open-standards approach to storing video in an object-
based form. This is very much like the way that content is stored in a QuickTime movie,
and promotional statements from Apple state that “both MPEG-4 and QuickTime are from
the same DNA.”
This bodes very well for the availability of tools for authoring MPEG-4 content, as it
wouldn't be very hard to convert the export mechanisms that currently write QuickTime
files so that they write MP4 container files with BIFS scene descriptions instead.
39.5.5
Scene Descriptions
MPEG-4 places all the video objects into a scene description. That scene may actually just
describe a surface that is completely covered by a video object, so you would see no
significant difference from the way that an MPEG-2 video player works.
Significant benefits are available because this scene environment is a true 3D space;
presentations that look as if a digital video effects (DVE) unit had been used are possible.
The scenes are described in a language called BIFS, which is short for Binary
Format for Scenes. BIFS evolved from the VRML standard, which was in use in the late
1990s but was not widely deployed at that time.
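To illustrate the idea behind a VRML-style scene tree, here is a minimal scene-graph sketch in Python. The node names, fields, and layout are invented for illustration; real BIFS encodes a much richer node set in a compact binary form:

```python
class Node:
    """A hypothetical scene-graph node: a name, a translation relative
    to its parent, and child nodes, in the spirit of a VRML/BIFS tree."""

    def __init__(self, name, translation=(0.0, 0.0, 0.0), children=None):
        self.name = name
        self.translation = translation
        self.children = children or []

    def world_positions(self, origin=(0.0, 0.0, 0.0)):
        """Accumulate parent translations down the tree, as a scene
        compositor would when placing each object in 3D space."""
        pos = tuple(o + t for o, t in zip(origin, self.translation))
        placed = {self.name: pos}
        for child in self.children:
            placed.update(child.world_positions(pos))
        return placed

# An invented scene: a flat background plane, a presenter in front of
# it, and a caption positioned relative to the presenter.
scene = Node("root", children=[
    Node("background", translation=(0.0, 0.0, -10.0)),
    Node("presenter", translation=(1.0, 0.0, -5.0), children=[
        Node("caption", translation=(0.0, -1.5, 0.0)),
    ]),
])
positions = scene.world_positions()
```

Because each object's position is expressed relative to its parent, moving the presenter node would carry the caption along with it, which is the essential property of an object-based scene description.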
39.5.6
Scene Construction
MPEG-4 BIFS provides some very sophisticated techniques for decomposing a scene
into component objects. The foreground and background are transmitted separately
and then composited into a single image in the playback system. The background
might be a still image while the foreground is a presenter. Weather forecasts are often
made this way, with a computer-generated graphic keyed into the background behind
the presenter.
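The playback-side compositing step can be sketched as a per-pixel alpha blend of a foreground object over a still background. The grayscale pixel values and mask below are made up purely to keep the example self-contained:

```python
def composite(background, foreground, alpha):
    """Key a foreground object (e.g. a presenter) over a still
    background image with a per-pixel alpha blend.  Pixels are plain
    grayscale values here to keep the sketch dependency-free."""
    return [
        [round(a * f + (1 - a) * b) for b, f, a in zip(brow, frow, arow)]
        for brow, frow, arow in zip(background, foreground, alpha)
    ]

bg = [[100, 100], [100, 100]]    # still background image
fg = [[255, 255], [255, 255]]    # foreground object pixels
mask = [[1.0, 0.0], [0.5, 0.0]]  # shape/alpha of the foreground
frame = composite(bg, fg, mask)
```

Because the background is a still image, it need only be sent once, while the smaller, changing foreground object is updated frame by frame; this is the source of the coding efficiency discussed below.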
MPEG-4 lets you deliver the images separately. With that coding efficiency taken into
account, you might reduce the bit rate to 20% or less of what coding the full frame as
MPEG-2 video would require.