The Mobile Media API

By Mark Balbes, Ph.D., OCI Principal Software Engineer

September 2004


For the last several years, OCI has published a series of articles describing the basic features of the Java 2 Micro Edition (J2ME), including the Connected Limited Device Configuration (CLDC) and Mobile Information Device Profile (MIDP). At the time, J2ME and CLDC/MIDP were, for all practical purposes, synonymous. Although many new features of J2ME were in the Java Community Process (JCP) pipeline, including a much-desired and now never-to-be PDA Profile, they were slow to materialize.

At last, J2ME is maturing at the same rate as the rest of Java. The JCP, which defines the new features that will become part of the Java platform, now lists 62 specifications relating to J2ME. Of these, two of the most important are the Wireless Messaging API (WMA) and the Mobile Media API (MMAPI). WMA was discussed in a previous SETT article. In this article, we focus on the Mobile Media API and provide examples for audio and video playback.

MMAPI provides a generic mechanism to play back and record time-based media such as audio and video. An arbitrary number of formats can be supported, limited only by the capabilities of the device. The API is designed in such a way as to be extensible to new formats as they become available while at the same time supporting devices with limited abilities.

Many of the ideas for MMAPI are taken from the Java Media Framework (JMF), but MMAPI is not a subset of JMF. Whereas JMF targets full-featured Java 2 Standard Edition (J2SE) systems, MMAPI is aimed at the much more limited CLDC environment. MMAPI can support other configurations, including the Connected Device Configuration (CDC), but the goal is to enable a rich audio and video experience on mobile devices. Therefore, memory efficiency and performance are two key design goals of the MMAPI expert group.

In other words, MMAPI allows us to develop software for mobile phones, pagers, and PDAs that can play and record both audio and video. In addition, cameras are supported so that a photograph can be captured and used within a custom application. In reality, MMAPI-enabled devices are still new on the market and since the API is targeted at limited devices, not all functionality is required. For example, it appears to be quite common now for mobile phones to support still-image capture but not full motion video.

There are several devices on the market that already include support for MMAPI. For a listing, see the J2ME Devices page.

General Concepts

MMAPI provides functionality through two basic abstractions: the data source, which supplies the media data, and the media player, which renders it.

A data source can be anything that provides media data, including files, resources in the classpath, or a network connection that provides streaming data. The specific formats that are supported are determined by the device that you use. MMAPI specifies Java properties that list which formats are available. The properties are audio.encodings and video.encodings. Each contains a space-separated list of supported formats.
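As a plain-J2SE sketch of that check (CLDC itself lacks String.split, so on a device you would tokenize by hand), the property parsing might look like this; the property value shown is hypothetical, since each device reports its own list:

```java
public class EncodingCheck {
    // Splits the space-separated value of the audio.encodings or
    // video.encodings property into its individual entries.
    static String[] parseEncodings(String property) {
        if (property == null || property.trim().length() == 0) {
            return new String[0]; // property absent: no formats reported
        }
        return property.trim().split("\\s+");
    }

    public static void main(String[] args) {
        // On a real device: String audio = System.getProperty("audio.encodings");
        String audio = "encoding=pcm encoding=wav encoding=midi"; // hypothetical value
        for (String format : parseEncodings(audio)) {
            System.out.println(format);
        }
    }
}
```

Checking these properties at startup lets an application disable features the device cannot support rather than failing at playback time.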

Although MMAPI contains an abstract DataSource class, you do not create it directly. Instead, you must specify a valid URL that points to the data. See Table 1 for examples of valid data source URLs.

In addition to a data source, MMAPI provides access to one or more appropriate players. These players provide general functionality like starting and stopping playback, and can indicate the duration of the recording.

MMAPI is also capable of recording audio and video as well as capturing images from a camera. Example URLs for audio and video playback and capture are listed in Table 1.

Table 1: Examples of Data Source URLs 

Audio playback: http://webserver/music.mp3
Video playback: http://webserver/movie.mpg
Audio capture: capture://audio
Video capture: capture://video


The Mobile Media API provides a limited set of classes and interfaces. In fact, only 4 classes are defined along with 18 interfaces. As is becoming common with Java APIs, the specific implementations of the interfaces are provided by the device manufacturer. Table 2 provides a brief description of each class and Table 3 describes each interface. The more important classes and interfaces are in bold.


Table 2: MMAPI Classes 

ContentDescriptor Describes the content type for the media.
DataSource Used in conjunction with one or more source streams to provide data for a player.
Manager* Entry point into MMAPI. The Manager creates a Player that can be used to play back or record media.
MediaException* Thrown by various methods to indicate an error condition.
* = included in MIDP 2.0


Table 3: MMAPI Interfaces 

Control* Marker interface for a class that provides specific functionality to manipulate the medium. For example, a VolumeControl can be used to raise or lower the volume on audio or video playback.
Controllable* Implemented by any class that can be controlled. Player is a Controllable because it can provide, for example, a VolumeControl and a GUIControl.
FramePositioningControl Allows for precise frame positioning of video, for example by seeking to a frame number.
GUIControl Provides a GUI interface to a Player.
MetaDataControl Provides access to metadata for media, for example, the author and date.
MIDIControl Provides advanced sound control. The inclusion of a MIDIControl is optional.
PitchControl Raises or lowers the playback pitch without altering the speed or volume.
Player* Defines the lifecycle for media playback and provides access to appropriate controls.
PlayerListener* Receives asynchronous progress events from Players.
RateControl Changes the playback rate.
RecordControl Records output from a Player.
SourceStream Used with a DataSource to provide more control than is available from a simple InputStream.
StopTimeControl Stops the media playback at a preset time.
TempoControl Controls the tempo of playback in millibeats per second, typically for MIDI players.
TimeBase Measures the progress of time. Multiple players can be synced to the same time base.
ToneControl* Provides a means to generate a sequence of monotonic tones.
VideoControl Controls video playback and provides an appropriate display.
VolumeControl* Changes the output volume.
* = included in MIDP 2.0

MIDP 2.0 Media API

When the members of the MIDP 2.0 expert group (JSR 118) decided to include audio support, they made sure that it was provided as a subset of the Mobile Media API. To do otherwise would fracture the Java platform, requiring similarly-featured software to be written differently depending on the platform it was to be run on.

The audio support in MIDP 2.0 includes most of the basic audio functionality discussed in this article. In fact, the audio examples in this article will run under MIDP 2.0. There is no support for video. Tables 2 and 3 indicate which MMAPI classes are supported in MIDP 2.0.

Audio Playback

Players and data sources are created by a Manager factory class. Creating a player for a given source is quite easy, even in a MIDP environment.

import;
import javax.microedition.midlet.MIDlet;

public class PlayRecordedSoundFromWebMIDlet extends MIDlet {
    protected void pauseApp() {}
    protected void destroyApp(boolean unconditional) {}
    protected void startApp() {
        try {
            Player player = Manager.createPlayer("http://localhost/bark.wav");
            player.start();
        } catch (MediaException e) {
            e.printStackTrace();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}

It is also possible to play a resource in the classpath, as the following example shows.

import;
import javax.microedition.midlet.MIDlet;

public class PlayRecordedSoundMIDlet extends MIDlet {
    protected void pauseApp() {}
    protected void destroyApp(boolean unconditional) {}
    protected void startApp() {
        try {
            String type = "audio/x-wav";
            InputStream is = getClass().getResourceAsStream("/audio/bark.wav");
            Player player = Manager.createPlayer(is, type);
            player.start();
        } catch (MediaException e) {
            e.printStackTrace();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}

Audio Tone Generation

In addition to playing prerecorded sounds, MMAPI-enabled devices can also generate tones dynamically. This means that we can now write applications that generate sound. For example, it would be relatively straightforward to create a music composer application.

Tone generation can be accomplished in one of two ways. For simple situations, the Manager.playTone(int note, int duration, int volume) method is used. For more fine-grained control or to play a sequence of tones, a ToneControl must be created from a Player.
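The note values passed to playTone are standard MIDI note numbers: ToneControl.C4 is defined as 60 (middle C), and each step of 1 is a semitone. As a plain-Java sketch of that numbering (the frequency formula is the usual equal-temperament relation, with A4 = note 69 = 440 Hz):

```java
public class NoteMath {
    static final int C4 = 60; // same value as ToneControl.C4

    // Frequency in Hz for a given MIDI note number (equal temperament).
    static double frequency(int note) {
        return 440.0 * Math.pow(2.0, (note - 69) / 12.0);
    }

    public static void main(String[] args) {
        int d = C4 + 2; // D4, two semitones above middle C
        int e = C4 + 4; // E4, four semitones above middle C
        System.out.println("D4 = " + d + " (" + frequency(d) + " Hz)");
        System.out.println("E4 = " + e + " (" + frequency(e) + " Hz)");
        // On a device: Manager.playTone(C4, 500, 100); // note, ms, volume 0-100
    }
}
```

This is the same arithmetic used to build the note constants in the tone-sequence example later in this section.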

Here is a simple MIDlet that generates a single tone. Notice that the MIDlet destroys itself after playing the tone so that it can be invoked multiple times.

import*;
import javax.microedition.midlet.MIDlet;

public class SimpleToneMIDlet extends MIDlet {
    protected void startApp() {
        try {
            Manager.playTone(50, 500, 100); // note 50, for 500 ms, at volume 100
            this.notifyDestroyed();
        } catch (MediaException e) {
            e.printStackTrace();
        }
    }

    protected void pauseApp() {}
    protected void destroyApp(boolean unconditional) {}
}

A tone sequence must be defined following the format given in the MMAPI specification. Unfortunately, once a player is realized, the tone sequence cannot be changed. In the example below, we play the same tones 5 times, each time at a slightly lower volume. Click here to see the full source code.

First we must create the tone sequence. Since the sequence has the volume embedded in it, we must recreate the sequence with the new volume each time we play it. Before creating the sequence, however, we'll define some useful constants corresponding to some of the notes on the scale.

private byte TEMPO = 30;
private byte volume = 100;
private byte d = 8; // eighth note
private byte C = ToneControl.C4;
private byte D = (byte) (C + 2);
private byte E = (byte) (C + 4);

private byte[] createSequence() {
    byte[] sequence = {
        ToneControl.VERSION, 1,        // always 1
        ToneControl.TEMPO, TEMPO,      // set the tempo; in WTK 2.1 only a value of 30 seems to work
        ToneControl.SET_VOLUME, volume, // set the new volume
        ToneControl.BLOCK_START, 0,    // define block 0
        C, d, D, d, E, d,              // repeatable block of 3 eighth notes
        ToneControl.BLOCK_END, 0,      // end block 0
        ToneControl.PLAY_BLOCK, 0,     // play block 0
        ToneControl.SILENCE, d, E, d, D, d, C, d, // play some other notes
        ToneControl.PLAY_BLOCK, 0,     // play block 0 again
    };
    return sequence;
}

To play the sequence, we must do a little more work than for a simple tone. A tone player is used to create a ToneControl. The tone sequence must then be defined in this control before the Player enters the prefetched or started state. That is, the player cannot acquire all of its resources until the tone sequence is defined.

In the example code below, we add a PlayerListener to the player. The player notifies its listeners as it transitions through its states.

private void play() throws IOException, MediaException {
    form.append("Quiet!");
    p = Manager.createPlayer(Manager.TONE_DEVICE_LOCATOR);
    p.addPlayerListener(this);
    p.realize();
    c = (ToneControl) p.getControl("ToneControl");
    c.setSequence(createSequence());
    p.start();
}

The PlayerListener interface requires a playerUpdate method to be defined. In this example, the method uses the END_OF_MEDIA event to trigger the creation of another player with a tone sequence set at a lower volume. It would have been nice to be able to reuse the same Player with the different sequences. However, the lifecycle of a Player does not permit this. Once a Player has entered the prefetched or started state, it cannot transition back to a state earlier than prefetched. In other words, a Player can only be used to play one data source. It can replay the same source, but it cannot be reused with a different source.

public void playerUpdate(Player player, String event, Object eventData) {
    if (PlayerListener.END_OF_MEDIA.equals(event)) {
        p.close(); // release the resources
        if (volume > 10) {
            try {
                volume /= 2;
                play(); // play() rebuilds the sequence using the new volume field
            } catch (IOException e1) {
                e1.printStackTrace();
            } catch (MediaException e) {
                e.printStackTrace();
            }
        } else {
            notifyDestroyed();
        }
    }
}
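Since the volume starts at 100 and is halved after each END_OF_MEDIA event until it is no longer above 10, the sequence ends up being played five times. A plain-Java trace of that arithmetic:

```java
public class VolumeTrace {
    // Traces the volume progression produced by the playerUpdate logic:
    // play once at full volume, then replay with the volume halved each
    // time end-of-media arrives, for as long as the volume exceeds 10.
    static int[] volumes() {
        int[] buf = new int[8];
        int count = 0;
        int volume = 100;
        buf[count++] = volume;    // initial playback at full volume
        while (volume > 10) {     // mirrors the check in playerUpdate
            volume /= 2;          // integer division, as in the MIDlet
            buf[count++] = volume;
        }
        int[] out = new int[count];
        System.arraycopy(buf, 0, out, 0, count);
        return out;
    }

    public static void main(String[] args) {
        for (int v : volumes()) {
            System.out.println(v);
        }
        // prints 100, 50, 25, 12, 6 -- five playbacks in total
    }
}
```

Note that integer division makes the progression 100, 50, 25, 12, 6 rather than an exact geometric series; the final playback at volume 6 is the one that triggers notifyDestroyed.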

Video Playback

It is relatively straightforward to play video using techniques similar to those for audio. The following code snippet shows how to display full-screen video. (To view the code for the entire application, click here.) Figure 1 shows a full-screen display in the J2ME Wireless Toolkit. We must provide a Canvas for the video to be drawn in. Since Canvas is abstract, we define a concrete subclass that does not do anything. The application constructs a player by specifying a valid URL. It then adds a PlayerListener to the player. This listener will be informed of state changes in the player, including when the video clip has reached the end of the media.

Before proceeding with the time-consuming task of realizing the player, the application first displays a busy screen to the user. Once the player has been realized, meaning that the necessary resources have been allocated and fetched, the application asks the player for a video control that is appropriate for the platform on which it is running. We specify that the video is to be displayed in full-screen mode on the Canvas using VideoControl.initDisplayMode().

private void showFullScreenDisplay() {
    Canvas canvas = new Canvas() {
        protected void paint(Graphics g) {
            // We won't draw anything ourselves
        }
    };
    Display.getDisplay(this).setCurrent(canvas);
    try {
        videoPlayer = Manager.createPlayer("http://localhost/Rabbit2004.mpg");
        videoPlayer.addPlayerListener(this);
        Display.getDisplay(this).setCurrent(busyForm);
        videoPlayer.realize();
        VideoControl videoControl = (VideoControl) videoPlayer.getControl("VideoControl");
        videoControl.initDisplayMode(VideoControl.USE_DIRECT_VIDEO, canvas);
        videoControl.setVisible(true);
        videoPlayer.start();
    } catch (IOException e) {
        e.printStackTrace();
    } catch (MediaException e) {
        e.printStackTrace();
    }
}

In addition to full-screen video, we can display video on a portion of the screen as part of a more feature-rich user interface. Figure 1 shows an embedded video display in the J2ME Wireless Toolkit. The example below is only slightly different from the full-screen example. Here, the VideoControl.initDisplayMode() method creates an Item that can be placed on a Form for display. The Item class is a GUI component defined in the javax.microedition.lcdui MIDP GUI package. Because this example runs only in a MIDP environment, the second parameter to VideoControl.initDisplayMode() can be null. In an AWT environment, the returned display would be a java.awt.Component. In a mixed environment where both AWT and lcdui are supported, the second parameter must specify either "java.awt.Component" or "javax.microedition.lcdui.Item".

private void showDisplayInWindow() {
    Form videoForm = new Form("Please watch");
    videoForm.append("Check this out!");
    try {
        videoPlayer = Manager.createPlayer("http://localhost/Rabbit2004.mpg");
        videoPlayer.addPlayerListener(this);
        Display.getDisplay(this).setCurrent(busyForm);
        videoPlayer.realize();
        VideoControl videoControl = (VideoControl) videoPlayer.getControl("VideoControl");
        if (videoControl != null) {
            Item videoItem = (Item) videoControl.initDisplayMode(VideoControl.USE_GUI_PRIMITIVE, null);
            videoControl.setDisplaySize(75, 75);
            videoForm.append(videoItem);
            videoForm.append("Watch above");
            Display.getDisplay(this).setCurrent(videoForm);
        }
        videoPlayer.start();
    } catch (IOException e) {
        e.printStackTrace();
    } catch (MediaException e) {
        e.printStackTrace();
    }
}

Figure 1: Examples of full-screen and embedded video

Wireless Toolkit Support

Now that this article has whetted your appetite, you will no doubt want to start programming with the Mobile Media API. One easy way to do this is to use the Sun J2ME Wireless Toolkit 2.1. This latest version has support for many J2ME optional packages, including MMAPI, as well as support for MIDP 2.0. The toolkit is configurable so that J2ME applications can be tested in a MIDP 1.0 or MIDP 2.0 environment, with or without MMAPI.

To run the examples used in this article, download the source code. Unzip the source code to C:\WTK21\apps (where WTK21 is the installation directory for the wireless toolkit). Place the Rabbit2004.mpg and bark.wav files onto your local web server. If your web server is running on a different machine, you will have to modify the example code to point to it. Start the wireless toolkit KToolbar and open either the MMAPIAudioDemo or MMAPIVideoDemo project. Then build and run the demos.


Summary

The Mobile Media API provides software developers with tremendous opportunities to enhance applications with sound and video never before available on small devices. This opens the door to creating a much richer experience for the end user, whether it be for gaming, business, productivity, or other, more specialized applications.

As always, when dealing with J2ME across many different devices with different form factors from multiple vendors, it is always important to test your application on the targeted deployment device. Since much of MMAPI is optional and media formats may or may not be supported, it becomes especially important to test under real deployment conditions.


Software Engineering Tech Trends (SETT) is a regular publication featuring emerging trends in software engineering.