QtMobility Reference Documentation

Multimedia

QtMultimediaKit provides a set of APIs that allow the developer to play, record and manage a collection of media content. It is dependent on the QtMultimedia module. QtMultimediaKit is the recommended API to build multimedia applications using Qt. The Phonon API is no longer recommended.

No Special Namespace

Unlike the other APIs in QtMobility, the Multimedia API is not in the QtMobility namespace.

Overview

This API delivers an easy-to-use interface to multimedia functions. The developer can use the API to display an image or a video, record sound, or play a multimedia stream.

There are several benefits this API brings to Qt. First, the developer can implement fundamental multimedia functions with minimal code, because the underlying functionality is already provided. There is also a great deal of flexibility in the media source and the generated multimedia: the source file does not need to be local to the device; it could be streamed from a remote location and identified by a URL. Finally, many different codecs are supported 'out of the box'.

The supplied examples give a good idea of the ease of use of the API. Once the supporting user interface code is set aside, we can see that the core functionality is available with minimal effort.

Audio

The Audio Recorder example is a good introduction to the basic use of the API. We will use snippets from this example to illustrate how to use the API to quickly build functionality.

The first step is to demonstrate recording audio to a file. When recording from an audio source there are a number of things we may want to control beyond the essential user interface. We may want a particular encoding of the file, MP3 or Ogg Vorbis for instance, or select a different input source. The user may modify the bitrate, number of channels, quality and sample rate. Here the example will only modify the codec and the source device, since they are essential.

To begin, the developer sets up a source and a recorder object. A QAudioCaptureSource object is created and used to initialize a QMediaRecorder object. The output file name is then set for the QMediaRecorder object.

 audiosource = new QAudioCaptureSource;
 capture = new QMediaRecorder(audiosource);

 capture->setOutputLocation(QUrl("test.raw"));

A list of devices is needed so that an input can be selected in the user interface

 for(int i = 0; i < audiosource->deviceCount(); i++)
     deviceBox->addItem(audiosource->name(i));

and a list of the supported codecs for the user to select a codec,

 QStringList codecs = capture->supportedAudioCodecs();
 for(int i = 0; i < codecs.count(); i++)
     codecsBox->addItem(codecs.at(i));

To set the selected device or codec, pass its index to the appropriate setter on audiosource or capture, for example,

 audiosource->setSelectedDevice(i);
 ...
 capture->setAudioCodec(codecIdx);
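
The other parameters mentioned earlier (bit rate, channel count, quality and sample rate) are grouped in a QAudioEncoderSettings object and applied with setEncodingSettings(). The following is a minimal sketch; the codec string and values are placeholders and depend on what the backend supports.

 QAudioEncoderSettings settings;
 settings.setCodec("audio/vorbis");              // placeholder; pick one of supportedAudioCodecs()
 settings.setSampleRate(44100);                  // -1 lets the backend choose
 settings.setBitRate(128000);
 settings.setChannelCount(2);
 settings.setQuality(QtMultimediaKit::HighQuality);

 capture->setEncodingSettings(settings);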

Now start recording by using the record() function from the new QMediaRecorder object

 capture->record();

And stop recording by calling the matching function stop() in QMediaRecorder.

 capture->stop();

How then would this audio file be played? The QMediaPlayer class will be used as a generic player. Since the player can play both video and audio files the interface will be more complex, but for now the example will concentrate on the audio aspect.

Playing the file is simple: create a player object, pass in the filename, set the volume or other parameters, then play, remembering that the code will also need to be hooked up to the user interface.

 QMediaPlayer *player = new QMediaPlayer;
 ...
 player->setMedia(QUrl::fromLocalFile("test.raw"));
 player->setVolume(50);
 player->play();

The filename does not have to be a local file. It could be a URL to a remote resource. Also by using the QMediaPlaylist class from this API we can play a list of local or remote files. The QMediaPlaylist class supports constructing, managing and playing playlists.

 player = new QMediaPlayer;

 playlist = new QMediaPlaylist(player);
 playlist->addMedia(QUrl("http://example.com/myfile1.mp3"));
 playlist->addMedia(QUrl("http://example.com/myfile2.mp3"));
 ...
 playlist->setCurrentPosition(1);
 player->play();

To manipulate the playlist there are the usual management functions (which are in fact slots): previous, next, setCurrentPosition and shuffle. Playlists can be built, saved and loaded using the API.
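
For example, the management slots and the load and save functions might be used along these lines (a sketch assuming the player and playlist created above; the playlist file name and format are placeholders):

 playlist->next();                                 // jump to the next item
 playlist->previous();                             // back to the previous item
 playlist->shuffle();                              // randomise the playback order

 // persist the playlist and restore it later
 playlist->save(QUrl::fromLocalFile("playlist.m3u"), "m3u");
 playlist->load(QUrl::fromLocalFile("playlist.m3u"), "m3u");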

Video

Continuing with the audio recorder/player example discussed above, we can show how to play video files with little change to the code.

To play a video playlist, the code is extended with another new QtMobility Project class: QVideoWidget. This class presents a video resource and provides signals and slots for controlling brightness, contrast, hue, saturation and full screen mode.

 player = new QMediaPlayer;

 playlist = new QMediaPlaylist(player);
 playlist->addMedia(QUrl("http://example.com/myclip1.mp4"));
 playlist->addMedia(QUrl("http://example.com/myclip2.mp4"));
 ...
 widget = new QVideoWidget(player);
 widget->show();

 playlist->setCurrentPosition(1);
 player->play();
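
The image controls and full screen mode mentioned above are slots on the widget, so they can be connected to user interface actions or called directly, for example (the values here are arbitrary):

 widget->setBrightness(50);        // image controls accept values from -100 to 100
 widget->setContrast(25);
 widget->setSaturation(0);
 widget->setHue(0);
 widget->setFullScreen(true);      // return to windowed mode with setFullScreen(false)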

The Player example does things a bit differently from our sample code. Instead of using a QVideoWidget object directly, the Player example has a VideoWidget class that inherits from QVideoWidget. This means that extra functionality can be added, such as full screen display on a double click or on a particular keypress.

     videoWidget = new VideoWidget(this);
     player->setVideoOutput(videoWidget);

     playlistModel = new PlaylistModel(this);
     playlistModel->setPlaylist(playlist);

Camera Support

Creating still images and video.

Still Images

In order to capture an image we need to create a QCamera object and use it to initialize a QCameraViewfinder, so we can see where the camera is pointing - a viewfinder. The camera object is also used to initialize a new QCameraImageCapture object, imageCapture. All that is then needed is to start the camera, lock it so that the settings are not changed while the image capture occurs, capture the image, and finally unlock the camera ready for the next photo.

         camera = new QCamera;
         viewFinder = new QCameraViewfinder();
         viewFinder->show();

         camera->setViewfinder(viewFinder);

         imageCapture = new QCameraImageCapture(camera);

         camera->setCaptureMode(QCamera::CaptureStillImage);
         camera->start();

         //on half pressed shutter button
         camera->searchAndLock();

         ...

         //on shutter button pressed
         imageCapture->capture();

         //on shutter button released
         camera->unlock();

Note: Alternatively, we could have used a QGraphicsVideoItem as a viewfinder.
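
A minimal sketch of that alternative, assuming a QGraphicsView and scene are available to host the item:

 QGraphicsScene *scene = new QGraphicsScene;
 QGraphicsView *view = new QGraphicsView(scene);

 QGraphicsVideoItem *videoItem = new QGraphicsVideoItem;
 scene->addItem(videoItem);
 view->show();

 camera->setViewfinder(videoItem);    // the graphics item now acts as the viewfinder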

Video Clips

Previously we saw code that allowed the capture of a still image. Recording video requires the use of a QMediaRecorder object and a QAudioCaptureSource for sound.

To record video we need a camera object, as before, a media recorder and a viewfinder object. The media recorder object will need to be initialized.

 camera = new QCamera;
 mediaRecorder = new QMediaRecorder(camera);

 camera->setCaptureMode(QCamera::CaptureVideo);
 camera->start();

 //on shutter button pressed
 mediaRecorder->record();

Signals from the mediaRecorder can be connected to slots to react to changes in the state of the recorder or to error events. Recording itself starts when the record() function of mediaRecorder is called; this causes the stateChanged() signal to be emitted. The recording process can be controlled with the record(), pause(), stop() and setMuted() slots in QMediaRecorder.
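
For instance, the state and error notifications might be hooked up like this (a sketch assuming slots named recorderStateChanged() and recorderError() exist on the receiving object):

 connect(mediaRecorder, SIGNAL(stateChanged(QMediaRecorder::State)),
         this, SLOT(recorderStateChanged(QMediaRecorder::State)));
 connect(mediaRecorder, SIGNAL(error(QMediaRecorder::Error)),
         this, SLOT(recorderError(QMediaRecorder::Error)));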

When the camera is in video mode, as decided by the application, pressing the shutter button locks the camera as before, but the record() function of QMediaRecorder is called instead.

Focus

Focusing is managed by the classes QCameraFocus and QCameraFocusControl. QCameraFocus allows the developer to set the general policy by means of the enums for the FocusMode and the FocusPointMode. FocusMode deals with settings such as AutoFocus, ContinuousFocus and InfinityFocus, whereas FocusPointMode deals with the various focus zones within the view. FocusPointMode has support for face recognition, center focus and a custom focus where the focus point can be specified.
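
The focus settings are reached through the camera's focus() object. A minimal sketch, assuming the backend supports continuous focus and custom focus points:

 QCameraFocus *focus = camera->focus();

 focus->setFocusMode(QCameraFocus::ContinuousFocus);

 // focus on a point given in relative viewfinder coordinates (0.0 to 1.0)
 focus->setFocusPointMode(QCameraFocus::FocusPointCustom);
 focus->setCustomFocusPoint(QPointF(0.25, 0.75));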

Canceling Asynchronous Operations

Various operations such as image capture and auto focusing occur asynchronously. These operations can often be cancelled by the start of a new operation, as long as this is supported by the backend. For image capture, the operation can be cancelled by calling cancelCapture(). For auto-focus, auto-exposure or white balance, cancellation can be done by calling unlock() with the relevant lock type, for example unlock(QCamera::LockFocus) for auto-focus.
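
In code, those cancellations look roughly like this, assuming the camera and imageCapture objects from the earlier snippets:

 imageCapture->cancelCapture();          // abandon an in-progress image capture

 camera->unlock(QCamera::LockFocus);     // cancel autofocus only
 camera->unlock();                       // or release all requested locks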

Platform Notes

Examples

Record a Sound Source

The AudioRecorder example demonstrates discovering the supported devices and codecs and using the recording functions of the QMediaRecorder class.

Play a Media File

The Player example is a simple multimedia player. Select a video file to play, stop, pause, show in fullscreen or manipulate various image attributes using the Color Options button.

Slide Show

The Slide Show example demonstrates the use of the QMediaImageViewer and QVideoWidget classes.

Camera Example

The Camera Example shows how to use the QtMultimediaKit API to quickly write a camera application in C++.

QML Camera Example

The QML Camera Example demonstrates still image capture and controls using the QML plugin. Video recording is not currently available.

QML Video Example

The QML Video Example demonstrates the various manipulations (move; resize; rotate; change aspect ratio) which can be applied to QML Video and Camera items.

It also shows how native code can be combined with QML to implement more advanced functionality - in this case, C++ code is used to calculate the QML frame rate and (on Symbian) the graphics memory consumption; these metrics are rendered in QML as semi-transparent items overlaid on the video content.

QML Video Shader Effects Example

The QML Video Shader Effects Example shows how the ShaderEffectItem element can be used to apply postprocessing effects, expressed in GLSL, to QML Video and Camera items.

It re-uses the frame rate and memory consumption display code used by the QML Video Example.

Finally, this application demonstrates the use of different top-level QML files to handle different physical screen sizes. On small-screen devices, menus are by default hidden, and only appear when summoned by a gesture. Large-screen devices show a more traditional layout in which menus are displayed around the video content pane.

Reference documentation

Main audio and video classes

QAudio

Contains enums used by the audio classes

QAudioCaptureSource

Interface to query and select an audio input endpoint

QAudioDeviceInfo

Interface to query audio devices and their functionality

QAudioEncoderSettings

Set of audio encoder settings

QAudioEndpointSelector

Audio endpoint selector media control

QAudioFormat

Stores audio parameter information

QAudioInput

Interface for receiving audio data from an audio input device

QAudioOutput

Interface for sending audio data to an audio output device

QGraphicsVideoItem

Graphics item which displays video produced by a QMediaObject

QMediaBindableInterface

The base class for objects extending the functionality of media objects

QMediaContent

Access to the resources relating to media content

QMediaImageViewer

Means of viewing image media

QMediaObject

Common base for multimedia objects

QMediaPlayer

Allows the playing of a media source

QMediaPlaylist

List of media content to play

QMediaPlaylistNavigator

Navigation for a media playlist

QMediaRecorder

Used for the recording of media content

QMediaResource

Description of a media resource

QMediaTimeInterval

Represents a time interval with integer precision

QMediaTimeRange

Represents a set of zero or more disjoint time intervals

QRadioTuner

Interface to the system's analog radio device

QVideoWidget

Widget which presents video produced by a media object

QtMultimediaKit

Contains miscellaneous identifiers used throughout the Qt Media services library

Camera classes

QCamera

Interface for system camera devices

QCameraExposure

Interface for exposure related camera settings

QCameraFocus

Interface for focus and zoom related camera settings

QCameraImageCapture

Used for capturing still images with a camera

QCameraImageProcessing

Interface for image processing related camera settings

QCameraViewfinder

Camera viewfinder widget

Advanced usage

For developers wishing to access some platform specific settings, or to port the Qt Multimedia APIs to a new platform or technology, see Multimedia Backend Development.

QML Elements

Video renderer selection on Symbian

On Symbian, the QVideoRendererControl class may provide video frames in one of two forms:

  • "software" - as a QPixmap which is backed by a CFbsBitmap. In this case, the pixel data is resident in CPU-addressable memory, so the client can access pixels directly via QVideoFrame::bits() (see the sketch after this list). QVideoFrame::handleType() returns QAbstractVideoBuffer::NoHandle when the video frame is in this form.
  • "EGL" - as an EGLImageKHR handle. In this case, the pixel data is resident in GPU memory. It therefore cannot be accessed by the client, but may be rendered to the screen more efficiently than video frames obtained via the "software" path.
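
The following minimal sketch shows how a client might read pixel data from a frame delivered via the "software" path; readFirstByte() is a hypothetical helper, not part of the API:

 bool readFirstByte(QVideoFrame &frame, uchar *value)
 {
     // only the "software" path keeps the pixel data in CPU-addressable memory
     if (frame.handleType() != QAbstractVideoBuffer::NoHandle)
         return false;

     if (!frame.map(QAbstractVideoBuffer::ReadOnly))
         return false;

     *value = frame.bits()[0];            // direct access to the pixel data
     frame.unmap();
     return true;
 }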

Which of these paths is available depends on the version of the Symbian platform, and on the source of the video data:

  • On Nokia Belle Feature Pack 1 and earlier: For QCamera, only the "software" path is available. For QMediaPlayer, neither path is available, so QVideoRendererControl is not supported.
  • After Nokia Belle Feature Pack 1: For QCamera, both paths are available; the "software" path is used by default. For QMediaPlayer, only the "EGL" path is available.

Where multiple paths are available, the default can be overridden by setting the "_q_eglRenderingAllowed" property on the QMediaService object. If this property is true and the "EGL" path is available, it is used. Otherwise the "software" path is used.

 // create a camera whose viewfinder may render via EGL
 camera = new QCamera;
 camera->service()->setProperty("_q_eglRenderingAllowed", true);

Note that, for rendering video frames to the screen, the QGraphicsVideoItem implementation uses the most efficient route available (which is never the "software" path). Selection of the correct rendering path is done automatically and is transparent to the client:

  • For fullscreen, untransformed video items, QVideoWidgetControl is used, meaning that video frames go directly to the display, and the client thread is not notified at all.
  • For non-fullscreen or transformed video items: If the "EGL" path is available and a hardware-accelerated paint engine is in use, the "EGL" path is used. Otherwise QVideoWidgetControl is used.