Introducing the MMAPI
From relatively early in the history of mobile Java, hardware vendors and Sun worked
together to explore how Java should support the rendering of multimedia content.
However, while participants in the Java community were hammering out the details of the
MMAPI, the hardware available could barely render digital audio, let alone video. Even so,
everyone soon recognized the important role that multimedia support would play for many
Java ME applications.
The MMAPI features a modular API that enables you both to create your own simple
audio media, by specifying frequency and tone data, and to render multimedia audio and
video using a variety of codecs. Because the MMAPI is extensible, hardware manufacturers
can add new codecs and transports, so you can use the same interfaces to render new data
formats as they become available.
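Both capabilities flow through the MMAPI's factory class, javax.microedition.media.Manager. The following sketch shows the two paths side by side: a synthesized tone and codec-based playback. The URL and MIME type are hypothetical, and real devices vary in which formats they support; this is an illustration of the API shape, not code for any particular handset.

```java
import javax.microedition.media.Manager;
import javax.microedition.media.MediaException;
import javax.microedition.media.Player;
import javax.microedition.media.control.ToneControl;

public class MediaSketch {
    // Synthesize a single tone: middle C (ToneControl.C4),
    // 500 milliseconds long, at maximum volume (100).
    public void beep() throws MediaException {
        Manager.playTone(ToneControl.C4, 500, 100);
    }

    // Render sampled audio through whatever codec the device
    // provides for the content. The URL here is hypothetical.
    public void playClip() throws Exception {
        Player p = Manager.createPlayer("http://example.com/clip.wav");
        p.realize();   // locate resources (network, codec)
        p.prefetch();  // acquire scarce resources, fill buffers
        p.start();     // begin rendering
    }
}
```

Note how the Player lifecycle (realize, prefetch, start) separates resource acquisition from rendering, which matters on memory-constrained devices.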
Note Limitations of space prevent me from thoroughly covering everything you can do with the MMAPI.
If you find yourself wanting to learn more about the MMAPI after you read this chapter, I encourage you to
read Pro Java ME MMAPI: Mobile Media API for Java Micro Edition by Vikram Goyal (Apress, 2006), which
goes into far more detail about how you can use the MMAPI in your Java ME applications.
Understanding Basic Multimedia Concepts
To use the MMAPI successfully, you must understand three key concepts about how it
divides the responsibilities of media rendering. The first concept has to do with how
creators and distributors package and deliver multimedia content. Content creators (be
they major companies or individuals posting to social sites like YouTube) use specific
media types when exchanging multimedia content. Today's media types are typically
sophisticated file formats that act as containers for highly compressed streams of
audiovisual data. For example, the popular Moving Picture Experts Group Version 4 (MPEG-4)
standard defines not just one but a suite of audio and video coding formats; an MPEG-4
file may have an audio stream in Advanced Audio Coding (AAC) synchronized with an
H.263 video stream, or it may be PureVoice audio synchronized with an MPEG-4 video
stream, or it may be something else altogether. Thus, when specifying what kinds of
multimedia their hardware can render, device vendors typically describe their support
in terms of both container formats and codecs.

Most mobile devices today rely on a complex combination of hardware, such as digital
signal processors (DSPs) and dedicated integrated circuits, and software to implement
the codecs necessary for rendering multimedia. The implementation of a specific MMAPI
stack interconnects with this hardware and software to provide highly efficient
mechanisms for rendering a variety of multimedia formats.
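Because the supported containers and codecs differ from device to device, the MMAPI lets an application discover them at run time through query methods on Manager. A minimal sketch, assuming only the standard query methods (passing null means "do not filter"):

```java
import javax.microedition.media.Manager;

public class CapabilityQuery {
    // Every MIME content type the device can render, across all
    // transport protocols (null = no protocol filter). Typical
    // results include entries such as "audio/x-wav" or "video/mpeg".
    public static String[] contentTypes() {
        return Manager.getSupportedContentTypes(null);
    }

    // Every transport protocol (for example, "http" or "file")
    // the device can use to deliver media (null = no type filter).
    public static String[] protocols() {
        return Manager.getSupportedProtocols(null);
    }
}
```

Querying at run time, rather than hard-coding a format, is what lets the same MIDlet degrade gracefully across handsets with different DSPs and codec sets.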