

Template universal app for video recording with MediaCapture using Imaging SDK Filters


Revision as of 04:25, 8 June 2014

Note: This is an entry in the Nokia Original Effect Wiki Challenge 2014Q2

This article provides an example app that is intended to be used as a template for applying filters to real-time video, with the ability to record video with the filters applied. The article also describes the key sections of code required to create your own version of this app.

Recorded video using the Nokia Imaging SDK with the MagicPenFilter.




With the introduction of universal apps and WinRT for Windows Phone 8.1, the Silverlight/WP8.0 method of accessing the camera has changed. The latest Nokia Imaging SDK includes an IImageProvider called CameraPreviewImageSource, which allows you to process frames from a video stream before capturing a final image. However, unless you provide your own video encoder, you cannot record the video output by the CameraPreviewImageSource.

WinRT provides a MediaCapture class with video preview, image capture, and video record capabilities. The preview can be output to a new XAML class called a CaptureElement. There is no traditional way to apply the Nokia Imaging SDK to the preview (or record) output of the MediaCapture class. However, the MediaCapture class will accept Media Foundation Transforms through the AddEffectAsync method. This article describes the key sections of code required to add a Media Foundation Transform to the app and how to incorporate the Nokia Imaging SDK into it.

Many parts of this app were inspired by or even pulled directly from code provided by other developers on this site.

The app described in this article can be downloaded by zip file or via GitHub here:

C++ Universal Component

The Media Foundation Transform must be coded in C++, and the interfaces that must be implemented within the transform class are quite complicated. Thankfully, several samples of a Media Foundation transform exist that require little modification for our purposes. A good starting point is the MediaExtensions sample from Microsoft's Universal Apps sample set, which contains a GrayscaleTransform component on which this app's transform is based.

Preparing the C++ Component

Add Media Foundation Libraries

The Media Foundation libraries need to be added to the linker. For each C++ project (WP8.1 and Win8), right-click on the project name and select Properties. Expand the Linker group and select Input. Change the Configuration (at the top) from Active to All Configurations and the Platform from Active to All Platforms so that the following changes are applied to every configuration and platform.

To the end of the "Additional Dependencies" value, add mfplat.lib and mfuuid.lib, separated by a semicolon (;).

Add Common Folder from sample & Include Directories

At the root of the sample app (this app or the MediaExtensions app) is a folder labeled Common, which contains several C++ header files (AsyncCB.h, CritSec.h, ExtensionsDefs.h, etc.). These are necessary for the media transform code to compile. Copy the entire Common folder to the root of your own solution.

Next, go to the property page for each C++ project in your app and change the configuration and platform to All as before. Expand the C/C++ group and click on General. To the end of the Additional Include Directories value, add a semicolon (;) and the path "..\..\Common" so that VS2013 can find the header files we just copied.
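For reference, after both project-property changes the relevant parts of each .vcxproj should look roughly like the following sketch (your existing dependency list and relative paths may differ):

```xml
<ItemDefinitionGroup>
  <ClCompile>
    <AdditionalIncludeDirectories>..\..\Common;%(AdditionalIncludeDirectories)</AdditionalIncludeDirectories>
  </ClCompile>
  <Link>
    <AdditionalDependencies>mfplat.lib;mfuuid.lib;%(AdditionalDependencies)</AdditionalDependencies>
  </Link>
</ItemDefinitionGroup>
```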

Modify pch.h

Replace the contents of your pch.h file with the following:

#pragma once
#include <collection.h>
#include <ppltasks.h>
// Windows Header Files:
#include <windows.h>
#include <mfapi.h>
#include <mfidl.h>
#include <mferror.h>
#include <d3d11.h>
#include <D2d1helper.h>
#include <assert.h>
#include <tchar.h>
#include <Strsafe.h>
#include <objidl.h>
#include <new>
#include <wrl\client.h>
#include <wrl\implements.h>
#include <wrl\ftm.h>
#include <wrl\event.h>
#include <wrl\wrappers\corewrappers.h>
#include <ExtensionsDefs.h>
using namespace Platform;
using namespace Microsoft::WRL;
using namespace Microsoft::WRL::Wrappers;

Add the Nokia Imaging SDK

Add the Nokia Imaging SDK to all projects in the solution using NuGet. You do NOT have to manually add a reference to the Imaging SDK in the C++ projects; it will not appear as a reference under the project properties, but it will be included in the project anyway.

Add NativeBuffer.h

This class is copied entirely from galazzo's article here:

The Transform Class

To the Shared C++ project, add both a .h header file and a .cpp source file. Creating these two files from scratch is beyond the scope of this article; instead, copy the contents of ImagingEffect.cpp and ImagingEffect.h from the sample into your project. (These files are too large to post directly in this article.) Several functions within this class are highlighted in the next section so that you can modify or extend them.

Static Transform Functions for NV12 & YUY2

Most of the work of the transform happens within the non-class functions TransformImage_NV12 and TransformImage_YUY2. These functions transform the video frame at the pointer pSrc and copy the new, transformed image to the pointer pDest. Another important variable within these functions is the pointer to a wide string, filterParams. This string contains parameters from the C# project that define which filters to use and what parameters to use with them.

auto size = Windows::Foundation::Size(dwWidthInPixels, dwHeightInPixels);
auto totalbytes = (int)dwHeightInPixels * (int)dwWidthInPixels * 2; //each macropixel of 4 bytes creates 2 pixels (YUYV)
Bitmap^ m_BitmapToProcess = AsBitmapYUY2(pSrc, (unsigned int)size.Width, (unsigned int)size.Height);
ApplyImagingFilters(m_BitmapToProcess, pDest, totalbytes, filterParams, ColorMode::Yuv422_Y1UY2V);

In order to use the Nokia Imaging SDK, the first step is to transform the raw byte data into a Bitmap. For the NV12 function, this is accomplished using another function pulled directly from galazzo's article [1]. For YUY2, the function was modified to handle the YUY2 format.

Nokia::Graphics::Imaging::Bitmap^ AsBitmapYUY2(const unsigned char* source, unsigned int width, unsigned int height)
{
    int totalDimensionLength = width * height;
    int size = totalDimensionLength * 2; // YUY2: each 4-byte macropixel (Y0 U Y1 V) covers 2 pixels, so 2 bytes per pixel
    ComPtr<ImagingEffects::NativeBuffer> nativeBuffer;
    MakeAndInitialize<ImagingEffects::NativeBuffer>(&nativeBuffer, (byte *)source, size);
    auto iinspectable = (IInspectable *)reinterpret_cast<IInspectable *>(nativeBuffer.Get());
    IBuffer ^buffer = reinterpret_cast<IBuffer ^>(iinspectable);
    nativeBuffer = nullptr;
    return ref new Bitmap(Windows::Foundation::Size((float)width, (float)height), ColorMode::Yuv422_Y1UY2V, 2 * width, buffer);
}

ApplyImagingFilters Function

The use of the Nokia Imaging SDK in a C++ app is similar to a C# app. First, a source must be created. Then, several Effects or Filters are applied. Finally, the result is rendered to a designated target. In this app, the source is a BitmapImageSource that uses the previously created Bitmap in its constructor.

Nokia::Graphics::Imaging::BitmapImageSource^ bis = ref new BitmapImageSource(sourceBitmap);
FilterEffect^ filterEffect = ref new FilterEffect(bis);
IVector<IFilter^>^ Filters = ref new Platform::Collections::Vector<IFilter^>();
wchar_t* token = NULL;
wchar_t* next_token = NULL;
std::wstring s = std::wstring(filterParams);
token = wcstok_s(&s[0], L";", &next_token);
while (token != NULL)
{
    auto filter = GetFilter(token);
    if (filter != nullptr)
    {
        Filters->Append(filter);
    }
    token = wcstok_s(NULL, L";", &next_token);
}
filterEffect->Filters = Filters;
auto renderer = ref new Nokia::Graphics::Imaging::BitmapRenderer(filterEffect, colorMode);
auto renderOp = renderer->RenderAsync();
auto renderTask = create_task(renderOp);
renderTask.then([pDest, totalBytes](Nokia::Graphics::Imaging::Bitmap^ bitmap)
{
    auto count = bitmap->Buffers->Length;
    auto data = FromIBuffer(bitmap->Buffers[0]->Buffer);
    CopyMemory(pDest, data, totalBytes);
});

Next, filters are added to the FilterEffect in a loop driven by the filterParams parameter. This is described in the next section.

Finally, a BitmapRenderer is created with a specified ColorMode so that the output matches the format of the input. In the Nokia Imaging SDK, NV12 corresponds to Yuv420Sp and YUY2 corresponds to Yuv422_Y1UY2V. A pointer to the raw byte data is obtained from the IBuffer inside the newly rendered Bitmap, and this byte data is then copied to the pointer pDest.

Note: NV12 and YUY2 are two different video formats that use luminance and chroma data instead of straight RGBA data. These are the only two formats used by the Nokia Lumia 920 and the LifeCam HD-5000 webcam.
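The byte-size arithmetic behind these two formats can be checked with a small standalone sketch (plain C++, no WinRT; the helper names and the 800×448 resolution are illustrative, not part of the sample):

```cpp
#include <cstddef>

// YUY2 is packed 4:2:2: one 4-byte macropixel (Y0 U Y1 V) covers 2 pixels,
// i.e. 2 bytes per pixel, with a row stride of 2 * width bytes.
std::size_t Yuy2FrameBytes(std::size_t width, std::size_t height)
{
    return width * height * 2;
}

// NV12 is planar 4:2:0: a full-resolution Y plane followed by a half-height
// interleaved UV plane, i.e. 1.5 bytes per pixel on average.
std::size_t Nv12FrameBytes(std::size_t width, std::size_t height)
{
    return width * height + (width * height) / 2;
}
```

This matches the totalbytes computation (width * height * 2) in TransformImage_YUY2 and the 2 * width pitch passed to the Bitmap constructor in AsBitmapYUY2.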

Parsing the Filter Parameters & Creating the Filters

In this app, the filter parameters are passed solely as text. Since C++ does not have reflection as C# does, the strings have to be matched to class names manually. In the string parameter filterParamters, filters are separated by semicolons (;) and each filter's arguments are separated by commas (,). Within the GetFilter function, each filter name is matched, and the number of arguments is counted to determine which filter constructor is used for the returned filter.

IFilter^ GetFilter(wchar_t* filterParamters)
{
    IFilter^ filter;
    wchar_t* token = NULL;
    wchar_t* next_token = NULL;
    token = wcstok_s(filterParamters, L",", &next_token);
    std::wstring s = std::wstring(token);
    std::vector<wchar_t*> v;
    if (s == L"LomoFilter")
    {
        token = wcstok_s(NULL, L",", &next_token);
        while (token != NULL)
        {
            v.push_back(token);
            token = wcstok_s(NULL, L",", &next_token);
        }
        switch (v.size())
        {
        case 0:
            filter = ref new LomoFilter();
            break;
        case 4:
            filter = ref new LomoFilter(wcstod(v[0], NULL), wcstod(v[1], NULL), (LomoVignetting)_wtoi(v[2]), (LomoStyle)_wtoi(v[3]));
            break;
        }
    }
    else if (s == L"SolarizeFilter")
    {
        filter = ref new SolarizeFilter();
    }
    return filter;
}

Warning: Currently, only LomoFilter and part of SolarizeFilter are ready to use with this solution. The template for adding any other filter can be copied directly from these two.
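The two-level tokenization used by the transform functions and GetFilter can be sketched in portable C++. The sketch below uses the standard std::wcstok in place of the MSVC-specific wcstok_s (both tokenize destructively through a state pointer), and SplitParams is a hypothetical helper, not part of the sample:

```cpp
#include <cwchar>
#include <string>
#include <vector>

// Split a filter-parameter string on a delimiter set, the scheme the article
// describes: ';' between filters, then ',' between a filter's name and its
// arguments. The string is taken by value because tokenization is destructive.
std::vector<std::wstring> SplitParams(std::wstring s, const wchar_t* delims)
{
    std::vector<std::wstring> out;
    wchar_t* state = nullptr;
    for (wchar_t* tok = std::wcstok(&s[0], delims, &state);
         tok != nullptr;
         tok = std::wcstok(nullptr, delims, &state))
    {
        out.push_back(tok);
    }
    return out;
}
```

Splitting L"LomoFilter,0.5,0.8;SolarizeFilter" on L";" yields the per-filter strings; splitting one of those on L"," yields the name followed by its arguments, which is exactly what GetFilter counts.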


SetProperties Function

The SetProperties function allows the C++ transform to receive data from the C# portion of the app. When the transform is applied to the MediaCapture class via AddEffectAsync, only a string identifying the transform is passed, and no instance of the transform is returned to the C# portion of the app. Therefore, one cannot pass data to the transform through conventional means. However, one of the parameters of the AddEffectAsync method takes an IPropertySet interface, which is essentially a dictionary of KeyValuePairs that must use WinRT types.

This parameter comes through to the MF Transform through the SetProperties function.

HRESULT hr = S_OK;
try
{
    IPropertySet^ properties = reinterpret_cast<IPropertySet^>(pConfiguration);
    IVector<String^>^ filterList = safe_cast<IVector<String^>^>(properties->Lookup(L"filterList"));
    auto size = filterList->Size;
    wchar_t temp[100] = { 0 };
    for (unsigned int i = 0; i < size; i++)
    {
        auto commandKey = filterList->GetAt(i);
        auto commandArray = safe_cast<String^>(properties->Lookup(commandKey));
        wchar_t* tempWide = const_cast<wchar_t*>(commandArray->Data());
        wcscat_s(temp, commandKey->Data());
        wcscat_s(temp, L",");
        wcscat_s(temp, tempWide);
        wcscat_s(temp, L";"); // ';' separates filters for the parser in the transform functions
    }
    m_altFilterParams = std::wstring(temp);
}
catch (Exception^ exc)
{
    hr = exc->HResult;
}
return hr;

This function parses the content of the IPropertySet and converts it to a std::wstring in the class called m_altFilterParams. However, the transform functions cannot access this class-member variable, so it must be passed via a pointer as an argument to the transform functions. This is done in OnProcessOutput:

// Invoke the image transform function.
assert(m_pTransformFn != nullptr);
if (m_pTransformFn)
{
    auto s = m_altFilterParams;
    // &s[0] passes a writable wide-string pointer (the filterParams argument)
    // to the static transform function.
    (*m_pTransformFn)(m_rcDest, outputLock.GetTopRow(), outputLock.GetStride(),
        inputLock.GetTopRow(), inputLock.GetStride(),
        m_imageWidthInPixels, m_imageHeightInPixels, &s[0]);
}
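Independent of WinRT, the flattening that SetProperties performs can be sketched in portable C++ (FlattenFilterParams and the plain pair list are hypothetical stand-ins for the real code, which appends into a fixed wchar_t buffer with wcscat_s):

```cpp
#include <string>
#include <utility>
#include <vector>

// Flatten (filterName, arguments) pairs into the single string format the
// transform functions parse: a filter's name and its comma-separated
// arguments joined with ',', and filters separated by ';'.
std::wstring FlattenFilterParams(
    const std::vector<std::pair<std::wstring, std::wstring>>& filters)
{
    std::wstring flat;
    for (const auto& f : filters)
    {
        if (!flat.empty()) flat += L";";
        flat += f.first;
        if (!f.second.empty()) flat += L"," + f.second;
    }
    return flat;
}
```

For example, a LomoFilter entry with arguments "0.5,0.8,2,1" plus an argument-free SolarizeFilter flatten to L"LomoFilter,0.5,0.8,2,1;SolarizeFilter", which GetFilter can then tokenize back apart.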

C# Universal App

Preparing the main App

In order to use the MF Transform that was just created, there are two steps that must be taken.

  1. The C++ component must be referenced by the C# app (the WP8.1 project references the WP8.1 component, and Win8 references Win8).
  2. The MF Transform must be registered.

In a desktop app, this would be done in the registry. In a WinRT app, it is done in the manifest file using an XML editor. Right-click the Package.appxmanifest file in the WP8.1 project, select "Open with...", and choose an XML editor. Between the Applications and Capabilities elements, insert:

<Extensions>
  <Extension Category="windows.activatableClass.inProcessServer">
    <InProcessServer>
      <Path>ImagingEffects.WindowsPhone.dll</Path>
      <ActivatableClass ActivatableClassId="ImagingEffects.ImagingEffect" ThreadingModel="both" />
    </InProcessServer>
  </Extension>
</Extensions>

The corresponding entry for the Win8 app just changes WindowsPhone.dll to Windows.dll.

MediaCapture & CaptureElement

To set up the page for viewing video, simply place a CaptureElement somewhere on the page and set its Source to the MediaCapture instance.

Warning: Failing to shut down the MediaCapture instance after video previewing has started will lock up your mobile device! In one case, it corrupted the phone to the extent that no other solutions could be loaded onto it and a hard reset/device wipe was required.

With WinRT, it is more difficult to capture the moment at which you exit or navigate away from the app; relying on the OnSuspending event is not enough. One solution is to monitor the CoreWindow's VisibilityChanged event, which fires whenever you switch to or away from the app. In the OnNavigatedTo event handler, add the following:

_devices = await DeviceInformation.FindAllAsync(DeviceClass.VideoCapture);
await StartMediaCapture();
var core = Windows.UI.Core.CoreWindow.GetForCurrentThread();
core.VisibilityChanged += core_VisibilityChanged;

_devices is a DeviceInformationCollection that will contain all the components capable of media capture on your device. The core_VisibilityChanged event handler contains logic that turns the video off when the window is not visible anymore and turns the video back on when the window is visible.

Warning: The core_VisibilityChanged event handler will NOT fire if you manually stop debugging in VS2013! If you just stop debugging, the next time any app accesses the camera, it will lock up the device! Always exit the app, or at least navigate to a different app, so that the event fires and shuts down the camera.

Capture Start

Before you can begin capturing video, the MediaCapture instance must be initialized with InitializeAsync, with the correct VideoDeviceId passed in the MediaCaptureInitializationSettings parameter.

app.MediaCapture = new MediaCapture();
var selectedDevice = _devices.FirstOrDefault(x => x.EnclosureLocation != null && x.EnclosureLocation.Panel == Windows.Devices.Enumeration.Panel.Back);
if (selectedDevice == null)
{
    selectedDevice = _devices.First();
}
await app.MediaCapture.InitializeAsync(new MediaCaptureInitializationSettings
{
    VideoDeviceId = selectedDevice.Id
});

The next major step is to select the resolution and ColorMode (subtype) of the preview and record streams. This is accomplished with GetAvailableMediaStreamProperties on the MediaCapture's VideoDeviceController, which returns an IReadOnlyList of IMediaEncodingProperties. Since a video device has been selected, casting each IMediaEncodingProperties to VideoEncodingProperties yields the required information. The available properties of the preview and record streams will differ.

_encodingPreviewProperties = app.MediaCapture.VideoDeviceController.GetAvailableMediaStreamProperties(MediaStreamType.VideoPreview);
_encodingRecorderProperties = app.MediaCapture.VideoDeviceController.GetAvailableMediaStreamProperties(MediaStreamType.VideoRecord);
var selectedPreviewProperties = _encodingPreviewProperties.First(x => ((VideoEncodingProperties)x).Width == 800);
await app.MediaCapture.VideoDeviceController.SetMediaStreamPropertiesAsync(MediaStreamType.VideoPreview, selectedPreviewProperties);
var selectedRecordingProperties = _encodingRecorderProperties.First(x => ((VideoEncodingProperties)x).Width == _encodingRecorderProperties.Max(y => ((VideoEncodingProperties)y).Width));
await app.MediaCapture.VideoDeviceController.SetMediaStreamPropertiesAsync(MediaStreamType.VideoRecord, selectedRecordingProperties);

Note: For mobile devices, it is better to use low bitrate capture resolutions for both the preview and record streams due to high processing requirements.

Adding the effect comes next. This is accomplished by passing a string identifying the transform to the AddEffectAsync method on the MediaCapture instance. The filters that should be used are passed within a PropertySet as the third argument of AddEffectAsync. To the PropertySet instance, first add a "filterList" entry whose value is a List of strings naming the filters to use. Then, for each filter, add an entry keyed by the filter name whose value is a comma-delimited series of parameters for that filter; use an empty string for no parameters.

PropertySet testSet = new PropertySet();
testSet.Add("filterList", new List<string>() { "LomoFilter" });
testSet.Add("LomoFilter", String.Format("0.5,0.8,{0},{1}", (int)LomoVignetting.High, (int)LomoStyle.Blue));
await app.MediaCapture.AddEffectAsync(MediaStreamType.VideoPreview, "ImagingEffects.ImagingEffect", testSet);

Finally, a call to StartPreviewAsync() will begin the camera preview.

Warning: Currently, only LomoFilter and SolarizeFilter can be used until more are added to the C++ component.


It is not necessary to stop the preview when recording is started. However, the effect MUST be applied to only one of these streams; otherwise, the effect will run TWICE, once in the preview output and again in the recording.

First, call ClearEffectsAsync with MediaStreamType.VideoPreview as the argument to remove the effect from the preview. Then, add the effect to the VideoRecord stream using the same filter parameters before calling StartRecordToStorageFileAsync (or any other recording method).

await app.MediaCapture.ClearEffectsAsync(MediaStreamType.VideoPreview);
await app.MediaCapture.AddEffectAsync(MediaStreamType.VideoRecord, "ImagingEffects.ImagingEffect", testSet);
await app.MediaCapture.StartRecordToStorageFileAsync(MediaEncodingProfile.CreateMp4(VideoEncodingQuality.HD720p), _tempStorageFile);

Note: The code for recording within the sample is not complete. Use with caution.


