Image Processing Reference
In-Depth Information
for sensory experiences and to develop (semi-)automatic annotation techniques for
the generation and integration of sensory effects into media assets.
In [68], an end-to-end solution integrating sensory effects and interactive components into a hybrid (Internet-broadcast) 3DTV system is presented. In the deployed experimental setup, the main audiovisual content (an extended report of a football match) is complemented with binaural audio, a cut-grass scent, ambient lighting effects, and main-lighting and shutter controllers (the immersion dimension), and with interactive 3D objects and related content delivered through a second screen (the interaction dimension). A combination of broadcast and broadband transmission mechanisms is implemented to transmit this complementary content. At the
user's premises, the content is delivered over the private IP network that connects
the receiver gateway with the visualization terminals and sensory devices. The
resulting system is compatible with current transmission (DVB-T), coding (AVC),
multiplexing (MPEG-2), signaling (DVB), and automation (KNX) standards.
The development and official release of the MPEG-V standard by the Moving Picture Experts Group (MPEG), and in particular of its Part 3, Sensory Information [84], represents an important step in the consolidation of the sensory experience concept. The standard establishes the architecture and associated information
representations for the interaction and interoperability between virtual worlds (i.e.,
multimedia content) and real worlds through various sensors and actuators.
Part 3 defines a set of sensory effects (e.g., light, temperature, wind, vibration,
touch) and associated semantics to deliver multi-sensorial content in association
with multimedia.
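To make the Part 3 descriptions concrete, the following is an illustrative sketch of a Sensory Effect Metadata (SEM) document. The namespace URIs correspond to the MPEG-V 2010 schemas, but the effect attributes shown here are simplified and should be checked against the standard itself rather than taken as conformant:

```xml
<sedl:SEM xmlns:sedl="urn:mpeg:mpeg-v:2010:01-SEDL-NS"
          xmlns:sev="urn:mpeg:mpeg-v:2010:01-SEV-NS"
          xmlns:si="urn:mpeg:mpeg21:2003:01-DIA-XSI-NS"
          xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
          si:timeScale="90000">
  <!-- Red ambient light at media time 0, at half intensity -->
  <sedl:Effect xsi:type="sev:LightType" sev:color="#FF0000"
               si:pts="0" intensity-value="50" activate="true"/>
  <!-- Wind effect one second later (90000 ticks at a 90 kHz timescale) -->
  <sedl:Effect xsi:type="sev:WindType"
               si:pts="90000" intensity-value="30" activate="true"/>
</sedl:SEM>
```

Each effect is anchored to the media timeline through a presentation timestamp (`si:pts`), which is how a renderer keeps actuators such as lamps and fans synchronized with the audiovisual stream.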
A recent Special Issue on MPEG-V, published in February 2013, gathers several contributions proposing end-to-end frameworks that implement the standard for the creation and delivery of sensory effects synchronized with audiovisual content.
Three relevant examples are provided in [85-87]. In [85], an authoring tool called SEVino is used to generate the SEM (Sensory Effect Metadata) descriptions corresponding to the different sensory effects introduced. The annotated content can be delivered over various distribution channels and visualized on any MPEG-V-compliant device. The SEM descriptions enable sensory effects to be rendered on off-the-shelf hardware in synchrony with the main audiovisual content, either in a standalone application or in a web browser. Concerning the user experience, the authors confirmed the hypotheses that sensory effects have a positive impact on the Quality of Experience (QoE) and on the intensity of emotions such as happiness or fun.
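On the rendering side, frameworks of this kind essentially parse the SEM description and fire effects against the media clock. A minimal sketch in Python, assuming a deliberately simplified XML layout (plain `pts` and `timeScale` attributes instead of the full namespaced SEM schema), not a conformant MPEG-V parser:

```python
# Sketch: turn a simplified SEM-like description into a time-ordered
# effect schedule that a renderer could poll against the media clock.
import xml.etree.ElementTree as ET

SEM_XML = """
<SEM timeScale="90000">
  <Effect type="Light" pts="0" intensity="50"/>
  <Effect type="Wind" pts="90000" intensity="30"/>
  <Effect type="Vibration" pts="180000" intensity="80"/>
</SEM>
"""

def parse_schedule(xml_text):
    """Return a list of (seconds, effect_type, intensity) sorted by time."""
    root = ET.fromstring(xml_text)
    scale = float(root.get("timeScale", "1"))  # ticks per second
    schedule = []
    for eff in root.findall("Effect"):
        t = int(eff.get("pts")) / scale  # convert ticks to seconds
        schedule.append((t, eff.get("type"), int(eff.get("intensity"))))
    return sorted(schedule)

def due_effects(schedule, media_clock_s):
    """Effects whose activation time has been reached by the media clock."""
    return [e for e in schedule if e[0] <= media_clock_s]

schedule = parse_schedule(SEM_XML)
print(due_effects(schedule, 1.0))  # → [(0.0, 'Light', 50), (1.0, 'Wind', 30)]
```

A real renderer would replace the polling loop with timers driven by the decoder's clock, and would map each effect type to a device command (e.g., a KNX or serial actuator message), but the synchronization principle is the same.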
The framework presented in [86] delivers sensory effects for home theaters based on the MPEG-V standard via the broadcast network. The paper thoroughly discusses the technical choices offered by the MPEG-V standard (and those adopted in the targeted implementation) for the description, encoding, synchronization, transport, adaptation, and rendering of sensory effects. The work in [87] also exploits the broadcast network's capabilities to deliver a haptic-enabled system based on the MPEG-V standard. The paper illustrates the data flow within the system, which comprises four main stages: the creation of haptic contents using the MPEG-V standard, their encoding/decoding using BIFS encoders/decoders, their