QtMultimediaKit provides a set of APIs that allow the developer to play, record and manage a collection of media content. It is dependent on the QtMultimedia module. QtMultimediaKit is the recommended API to build multimedia applications using Qt. The Phonon API is no longer recommended.
Unlike the other APIs in QtMobility, the Multimedia API is not in the QtMobility namespace.
This API delivers an easy-to-use interface to multimedia functions. The developer can use the API to display an image or a video, record sound, or play a multimedia stream.
There are several benefits this API brings to Qt. Firstly, the developer can now implement fundamental multimedia functions with minimal code, mostly because they are already implemented. Secondly, there is a great deal of flexibility in the media source and in the generated multimedia: the source file does not need to be local to the device; it could be streamed from a remote location and identified by a URL. Finally, many different codecs are supported 'out of the box'.
The supplied examples give a good idea of the ease of use of the API. When the supporting user interface code is ignored, we can see that functionality is immediately available with minimal effort.
The Audio Recorder example is a good introduction to the basic use of the API. We will use snippets from this example to illustrate how to use the API to quickly build functionality.
The first step is to demonstrate recording audio to a file. When recording from an audio source there are a number of things we may want to control beyond the essential user interface. We may want a particular encoding of the file, MP3 or Ogg Vorbis for instance, or to select a different input source. The user may also wish to modify the bitrate, number of channels, quality and sample rate. Here the example will only modify the codec and the source device, since these are the essential choices.
To begin, the developer sets up a source and a recorder object. A QAudioCaptureSource object is created and used to initialize a QMediaRecorder object. The output file name is then set for the QMediaRecorder object.
audiosource = new QAudioCaptureSource;
capture = new QMediaRecorder(audiosource);
capture->setOutputLocation(QUrl("test.raw"));
A list of devices is needed so that an input can be selected in the user interface
for(int i = 0; i < audiosource->deviceCount(); i++)
    deviceBox->addItem(audiosource->name(i));
and a list of the supported codecs for the user to select a codec,
QStringList codecs = capture->supportedAudioCodecs();
for(int i = 0; i < codecs.count(); i++)
    codecsBox->addItem(codecs.at(i));
To set the selected device or codec, pass its index to the appropriate setter on audiosource or capture, for example:
audiosource->setSelectedDevice(i);
...
capture->setAudioCodec(codecIdx);
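The remaining parameters mentioned earlier, the bitrate, number of channels, quality and sample rate, could be applied in a similar way through a QAudioEncoderSettings object. The following sketch is only an illustration; the values are arbitrary and reuse the codecs list and capture object from the snippets above.

QAudioEncoderSettings settings;
settings.setCodec(codecs.at(codecIdx));            // codec selected in the UI
settings.setBitRate(128000);                       // 128 kbit/s
settings.setChannelCount(2);                       // stereo
settings.setSampleRate(44100);                     // 44.1 kHz
settings.setQuality(QtMultimediaKit::HighQuality);
capture->setEncodingSettings(settings);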
Now start recording by using the record() function from the new QMediaRecorder object
capture->record();
And stop recording by calling the matching function stop() in QMediaRecorder.
capture->stop();
How then would this audio file be played? The QMediaPlayer class will be used as a generic player. Since the player can play both video and audio files, the interface will be more complex, but for now the example will concentrate on the audio aspect.
Playing the file is simple: create a player object, pass in the filename, set the volume or other parameters, and then play, remembering that the code will also need to be hooked up to the user interface.
QMediaPlayer *player = new QMediaPlayer;
...
player->setMedia(QUrl::fromLocalFile("test.raw"));
player->setVolume(50);
player->play();
The filename does not have to be a local file. It could be a URL to a remote resource. Also by using the QMediaPlaylist class from this API we can play a list of local or remote files. The QMediaPlaylist class supports constructing, managing and playing playlists.
player = new QMediaPlayer;
playlist = new QMediaPlaylist(player);
playlist->addMedia(QUrl("http://example.com/myfile1.mp3"));
playlist->addMedia(QUrl("http://example.com/myfile2.mp3"));
...
playlist->setCurrentPosition(1);
player->play();
To manipulate the playlist, the usual management functions are available (and are in fact slots): previous, next, setCurrentPosition and shuffle. Playlists can also be built, saved and loaded using the API.
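As an illustration only (the button names here are hypothetical and not part of the example), user interface controls can be connected directly to these slots, and a playlist can be saved and loaded again, assuming a suitable playlist format plugin such as m3u is available:

connect(nextButton, SIGNAL(clicked()), playlist, SLOT(next()));
connect(previousButton, SIGNAL(clicked()), playlist, SLOT(previous()));

// write the current playlist out and read it back later
playlist->save(QUrl::fromLocalFile("playlist.m3u"), "m3u");
...
playlist->load(QUrl::fromLocalFile("playlist.m3u"), "m3u");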
Continuing with the audio recorder/player example, we can show how to play video files with little change to the code. To play a video playlist, the code is extended with another new QtMobility Project class: QVideoWidget. This class enables control of a video resource, with signals and slots for brightness, contrast, hue, saturation and full-screen mode.
player = new QMediaPlayer;
playlist = new QMediaPlaylist(player);
playlist->addMedia(QUrl("http://example.com/myclip1.mp4"));
playlist->addMedia(QUrl("http://example.com/myclip2.mp4"));
...
widget = new QVideoWidget(player);
widget->show();
playlist->setCurrentPosition(1);
player->play();
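Purely as an illustration of the controls mentioned above (the values are arbitrary), the picture settings take values in the range -100 to 100 and full-screen mode is a simple boolean:

widget->setBrightness(25);      // brighten the picture slightly
widget->setContrast(-10);       // reduce the contrast
widget->setSaturation(50);      // boost the colour saturation
widget->setFullScreen(true);    // switch to full-screen mode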
The Player example does things a bit differently from our sample code. Instead of using a QVideoWidget object directly, the Player example has a VideoWidget class that inherits from QVideoWidget. This allows behaviour such as full-screen display to be added, triggered either by a double click or by a particular keypress.
videoWidget = new VideoWidget(this);
player->setVideoOutput(videoWidget);

playlistModel = new PlaylistModel(this);
playlistModel->setPlaylist(playlist);
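The exact implementation belongs to the Player example, but a minimal sketch of such a subclass, toggling full-screen mode on a double click and leaving it on the Escape key, could look like this:

class VideoWidget : public QVideoWidget
{
    Q_OBJECT
public:
    VideoWidget(QWidget *parent = 0) : QVideoWidget(parent) {}

protected:
    // toggle full-screen mode on a double click
    void mouseDoubleClickEvent(QMouseEvent *event)
    {
        setFullScreen(!isFullScreen());
        event->accept();
    }

    // leave full-screen mode when Escape is pressed
    void keyPressEvent(QKeyEvent *event)
    {
        if (event->key() == Qt::Key_Escape && isFullScreen()) {
            setFullScreen(false);
            event->accept();
        } else {
            QVideoWidget::keyPressEvent(event);
        }
    }
};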
Creating still images and video.
In order to capture an image we need to create a QCamera object and use it to initialize a QCameraViewfinder, so we can see where the camera is pointing - a viewfinder. The camera object is also used to initialize a new QCameraImageCapture object, imageCapture. All that is then needed is to start the camera, lock it so that the settings are not changed while the image capture occurs, capture the image, and finally unlock the camera ready for the next photo.
camera = new QCamera;

viewFinder = new QCameraViewfinder();
viewFinder->show();

camera->setViewfinder(viewFinder);

imageCapture = new QCameraImageCapture(camera);

camera->setCaptureMode(QCamera::CaptureStillImage);
camera->start();

//on half pressed shutter button
camera->searchAndLock();
...
//on shutter button pressed
imageCapture->capture();

//on shutter button released
camera->unlock();
Note: Alternatively, we could have used a QGraphicsVideoItem as a viewfinder.
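A sketch of that alternative, assuming a Graphics View scene is acceptable for the application, could look like this:

QGraphicsScene *scene = new QGraphicsScene;
QGraphicsView *view = new QGraphicsView(scene);

QGraphicsVideoItem *videoItem = new QGraphicsVideoItem;
scene->addItem(videoItem);
view->show();

// use the graphics item as the viewfinder instead of a widget
camera->setViewfinder(videoItem);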
Previously we saw code that allowed the capture of a still image. Recording video requires the use of a QMediaRecorder object and a QAudioCaptureSource for sound.
To record video we need, as before, a camera object, along with a media recorder and a viewfinder object. The media recorder will need to be initialized with the camera.
camera = new QCamera;
mediaRecorder = new QMediaRecorder(camera);

camera->setCaptureMode(QCamera::CaptureVideo);
camera->start();

//on shutter button pressed
mediaRecorder->record();
Signals from the mediaRecorder can be connected to slots to react to changes in the state of the recorder or to error events. Recording itself starts when the record() function of mediaRecorder is called; this causes the stateChanged() signal to be emitted. The recording process can be controlled with the record(), pause(), stop() and setMuted() slots in QMediaRecorder.
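For example, with hypothetical slot names that are not part of the example, the state and error signals could be connected like this:

connect(mediaRecorder, SIGNAL(stateChanged(QMediaRecorder::State)),
        this, SLOT(updateRecorderState(QMediaRecorder::State)));
connect(mediaRecorder, SIGNAL(error(QMediaRecorder::Error)),
        this, SLOT(displayRecorderError(QMediaRecorder::Error)));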
When the camera is in video mode, as decided by the application, pressing the shutter button locks the camera as before, but then calls the record() function in QMediaRecorder instead of capturing an image.
Focusing is managed by the classes QCameraFocus and QCameraFocusControl. QCameraFocus allows the developer to set the general policy by means of the enums for the FocusMode and the FocusPointMode. FocusMode deals with settings such as AutoFocus, ContinuousFocus and InfinityFocus, whereas FocusPointMode deals with the various focus zones within the view. FocusPointMode has support for face recognition, center focus and a custom focus where the focus point can be specified.
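As an illustration of setting this policy (the particular values are arbitrary), the focus object is obtained from the camera and configured directly; custom focus points use relative coordinates between 0.0 and 1.0:

QCameraFocus *focus = camera->focus();

// continuous autofocus with a custom focus point in the
// upper-left quarter of the frame
focus->setFocusMode(QCameraFocus::ContinuousFocus);
focus->setFocusPointMode(QCameraFocus::FocusPointCustom);
focus->setCustomFocusPoint(QPointF(0.25, 0.25));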
Various operations such as image capture and auto focusing occur asynchronously. These operations can often be cancelled by the start of a new operation, as long as this is supported by the backend. For image capture, the operation can be cancelled by calling cancelCapture(). Auto-focus, auto-exposure or white balance operations can be cancelled by calling unlock() with the corresponding lock type, for example unlock(QCamera::LockFocus) for auto-focus.
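A short sketch, reusing the camera and imageCapture objects created earlier:

// abandon an image capture that is still in progress
imageCapture->cancelCapture();

// cancel a pending autofocus operation only, leaving any
// exposure or white balance locks in place
camera->unlock(QCamera::LockFocus);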
AudioRecorder is a demonstration of the discovery of the supported devices and codecs and the use of recording functions in the QMediaRecorder class.
The Player example is a simple multimedia player. Select a video file to play, stop, pause, show in fullscreen or manipulate various image attributes using the Color Options button.
The Slide Show shows the use of the QMediaImageViewer and QVideoWidget classes.
The Camera Example shows how to use the QtMultimediaKit API to quickly write a camera application in C++.
The QML Camera Example demonstrates still image capture and controls using the QML plugin. Video recording is not currently available.
The QML Video Example demonstrates the various manipulations (move; resize; rotate; change aspect ratio) which can be applied to QML Video and Camera items.
It also shows how native code can be combined with QML to implement more advanced functionality - in this case, C++ code is used to calculate the QML frame rate and (on Symbian) the graphics memory consumption; these metrics are rendered in QML as semi-transparent items overlaid on the video content.
The QML Video Shader Effects Example shows how the ShaderEffectItem element can be used to apply postprocessing effects, expressed in GLSL, to QML Video and Camera items.
It re-uses the frame rate and memory consumption display code used by the QML Video Example.
Finally, this application demonstrates the use of different top-level QML files to handle different physical screen sizes. On small-screen devices, menus are by default hidden, and only appear when summoned by a gesture. Large-screen devices show a more traditional layout in which menus are displayed around the video content pane.
QAudio | Contains enums used by the audio classes
QAudioCaptureSource | Interface to query and select an audio input endpoint
QAudioDeviceInfo | Interface to query audio devices and their functionality
QAudioEncoderSettings | Set of audio encoder settings
QAudioEndpointSelector | Audio endpoint selector media control
QAudioFormat | Stores audio parameter information
QAudioInput | Interface for receiving audio data from an audio input device
QAudioOutput | Interface for sending audio data to an audio output device
QGraphicsVideoItem | Graphics item which displays video produced by a QMediaObject
QMediaBindableInterface | The base class for objects extending media objects functionality
QMediaContent | Access to the resources relating to a media content
QMediaImageViewer | Means of viewing image media
QMediaObject | Common base for multimedia objects
QMediaPlayer | Allows the playing of a media source
QMediaPlaylist | List of media content to play
QMediaPlaylistNavigator | Navigation for a media playlist
QMediaRecorder | Used for the recording of media content
QMediaResource | Description of a media resource
QMediaTimeInterval | Represents a time interval with integer precision
QMediaTimeRange | Represents a set of zero or more disjoint time intervals
QRadioTuner | Interface to the system's analog radio device
QVideoWidget | Widget which presents video produced by a media object
QtMultimediaKit | Contains miscellaneous identifiers used throughout the Qt Media services library
QCamera | Interface for system camera devices
QCameraExposure | Interface for exposure related camera settings
QCameraFocus | Interface for focus and zoom related camera settings
QCameraImageCapture | Used for the recording of media content
QCameraImageProcessing | Interface for image processing related camera settings
QCameraViewfinder | Camera viewfinder widget
For developers wishing to access some platform specific settings, or to port the Qt Multimedia APIs to a new platform or technology, see Multimedia Backend Development.
On Symbian, the QVideoRendererControl class may provide video frames via one of two rendering paths, referred to below as the "EGL" path and the "software" path. Which of these paths is available depends on the version of the Symbian platform and on the source of the video data.
Where multiple paths are available, the default can be overridden by setting the "_q_eglRenderingAllowed" property on the QMediaService object. If this property is true and the "EGL" path is available, it is used. Otherwise the "software" path is used.
// create a camera whose viewfinder may render via EGL
camera = new QCamera;
camera->service()->setProperty("_q_eglRenderingAllowed", true);
Note that, for rendering video frames to the screen, the QGraphicsVideoItem implementation uses the most efficient route available (which is never the "software" path). Selection of the correct rendering path is done automatically and is transparent to the client.
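In other words, client code simply attaches the item to a player (or camera) in the usual way; the following sketch assumes an existing QGraphicsScene named scene and a QMediaPlayer named player:

QGraphicsVideoItem *videoItem = new QGraphicsVideoItem;
scene->addItem(videoItem);

// the most efficient rendering path is chosen automatically
player->setVideoOutput(videoItem);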