Wrapping Up
Through the optional APIs defined in JSR 135 and JSR 287, the Java ME platform
gives your applications the ability to render rich audio and video in a variety of
formats, including WAV, MPEG, and SVG. Although the MMAPI that JSR 135
defines is fundamentally different from the SVGAPI that JSR 287 defines, together they
give you broad latitude in your application's UI design.
The MMAPI uses a paradigm reminiscent of MVC, the pattern used by many user-interface
frameworks such as Swing. Your application obtains or provides a data source to a
player, which can be manipulated by one or more controls that affect media rendering
in some way. The MMAPI's Manager class creates individual Player instances from a
data source such as an InputStream instance, or from a locator that specifies a data
source's location. Locators can point to media stored on the device, to a sensor on the
device, or to remote media on the network; many (but not all) MMAPI implementations
support some form of remote media access via HTTP, RTP, or RTSP.
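A minimal sketch of that pattern, assuming a Java ME environment with the MMAPI present; the HTTP URL is a placeholder, not a real resource:

```java
import java.io.IOException;
import java.io.InputStream;

import javax.microedition.media.Manager;
import javax.microedition.media.MediaException;
import javax.microedition.media.Player;

// Sketch: the Manager hands back a Player for either a locator string
// or an InputStream paired with a MIME type.
public class PlayerCreation {
    // Create a Player from a remote locator (URL is a placeholder).
    public Player fromLocator() throws IOException, MediaException {
        return Manager.createPlayer("http://example.com/clip.wav");
    }

    // Create a Player from a stream you already hold, naming its content type.
    public Player fromStream(InputStream in) throws IOException, MediaException {
        return Manager.createPlayer(in, "audio/x-wav");
    }
}
```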
Because the MMAPI provides a Java ME wrapper around dedicated hardware
resources, applications that use the MMAPI need to consider carefully when to acquire
those resources. The Player interface implements a state machine that helps restrict
access to limited resources on the device. A Player object can be in one of five states:
unrealized, realized, prefetched, started, or closed. The Player interface provides methods
to transition through these states; typically, your application creates a Player instance and
invokes its realize and prefetch methods only just before starting playback with start.
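That lifecycle can be sketched as follows; error handling is trimmed for brevity, and the locator is whatever your application supplies:

```java
import java.io.IOException;

import javax.microedition.media.Manager;
import javax.microedition.media.MediaException;
import javax.microedition.media.Player;

// Sketch of the Player state machine defined by JSR 135.
public class PlayerLifecycle {
    public void playClip(String locator) throws IOException, MediaException {
        Player player = Manager.createPlayer(locator); // UNREALIZED
        player.realize();   // examine the media, acquire non-scarce resources -> REALIZED
        player.prefetch();  // acquire scarce/exclusive resources -> PREFETCHED
        player.start();     // begin rendering -> STARTED
        // ...later, when playback is no longer needed:
        player.stop();      // pause rendering -> back to PREFETCHED
        player.close();     // release everything -> CLOSED (terminal state)
    }
}
```

Deferring prefetch until just before start keeps scarce resources (codecs, audio hardware) free for as long as possible, at the cost of a slightly longer startup latency.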
MMAPI Player objects are also factories for Control objects; a Control may modify
the behavior of a Player (such as by adjusting its volume) or may provide additional
functionality, such as access to a user-interface component you can use to show the
video in a media file. Capture from audio and video sensors on a Java device works this
way: you specify a locator for the device, and then use the VideoControl's getSnapshot
method to obtain an image snapshot, or the RecordControl interface to record a stream
of audio or video data. Not all devices support audio or video capture, however, and the
Java ME runtime provides system properties that enumerate precisely which media types
and sensors a specific MMAPI implementation supports. Some devices may also support
JSR 234, which defines additional controls you can use with the MMAPI to control
capture sensors as well as perform additional multimedia operations.
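A sketch of video capture along these lines, assuming a device that reports capture support; the capture://video locator and the supports.video.capture property come from the MMAPI specification, but exact behavior varies by implementation:

```java
import javax.microedition.media.Manager;
import javax.microedition.media.Player;
import javax.microedition.media.control.VideoControl;

// Sketch: query capability, open the camera, and grab one snapshot.
public class CameraSnapshot {
    public byte[] takeSnapshot() throws Exception {
        // The runtime exposes capability properties; "true" means supported.
        if (!"true".equals(System.getProperty("supports.video.capture"))) {
            return null; // no camera capture on this device
        }
        Player camera = Manager.createPlayer("capture://video");
        camera.realize(); // Player must be at least REALIZED to yield controls
        VideoControl vc = (VideoControl) camera.getControl(
                "javax.microedition.media.control.VideoControl");
        // Most implementations require a display mode before getSnapshot;
        // USE_GUI_PRIMITIVE attaches the viewfinder to a UI component.
        vc.initDisplayMode(VideoControl.USE_GUI_PRIMITIVE, null);
        // null requests the default encoding; the property
        // "video.snapshot.encodings" lists what the device supports.
        byte[] imageData = vc.getSnapshot(null);
        camera.close(); // release the camera promptly
        return imageData;
    }
}
```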
The SVGAPI, on the other hand, supports the SVG Tiny 1.2 standard defined by the
W3C. Using SVG, you can define images, static or animated, that appear crisp and
unpixelated at nearly any rendering size. You can even build whole parts of your
application's user interface in SVG by specifying events within your SVG document that
your Java ME application receives in response to user operations, such as focus changes.
SVG is based on XML, and many vector-based drawing programs support this widely
adopted standard, making it readily available to mobile content developers. The SVGAPI's
packages and classes offer a rich set of features, including access to portions of an SVG
document through the SVG DOM. In fact, you can even create SVG images on the fly,
letting users create new SVG images right on the Java ME device from within your
application.
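A sketch of that DOM access, assuming the javax.microedition.m2g package from JSR 287; the element id "myRect" is hypothetical and would come from your own SVG document:

```java
import java.io.IOException;
import java.io.InputStream;

import javax.microedition.m2g.SVGImage;
import javax.microedition.m2g.ScalableImage;
import org.w3c.dom.Document;
import org.w3c.dom.svg.SVGElement;

// Sketch: load an SVG document, then mutate it through the SVG DOM.
public class SvgDomAccess {
    public SVGImage recolor(InputStream svgStream) throws IOException {
        // Second argument is an optional ExternalResourceHandler; null is fine
        // for self-contained documents.
        SVGImage image = (SVGImage) ScalableImage.createImage(svgStream, null);
        Document doc = image.getDocument();
        // "myRect" is a placeholder id; look up whatever your document defines.
        SVGElement rect = (SVGElement) doc.getElementById("myRect");
        if (rect != null) {
            rect.setTrait("fill", "red"); // change the element's fill on the fly
        }
        return image; // render via an SVGAnimator or ScalableGraphics
    }
}
```

For building images from scratch rather than loading them, SVGImage.createEmptyImage gives you an empty document whose DOM you can populate the same way.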