Real-time camera viewfinder filters in Native code


This article explains how to create real-time camera filters, using native code (C++).

Article Metadata
Code Example: Tested with
Device(s): Lumia 810, Lumia 820, Lumia 822, Lumia 920
Compatibility
Platform(s): Windows Phone 8
Article
Created: Mansewiz (27 Nov 2012)
Last edited: Mansewiz (25 Nov 2012)

Note: This is an "internal" entry in the Windows Phone 8 Wiki Competition 2012Q4. The author is a Nokia / Microsoft employee.


Introduction

In this article, we will have a quick look at how one can create real-time filters for the camera viewfinder using native code (C++). For the sake of simplicity, the example will implement a simple gray filter. For that we will use the new Windows PRT camera APIs that are now available in the WP8 SDK. We will also make use of the ability to write C++ code, another new feature of the WP8 SDK. The end result will look like this:

NativeFilterDemo.png

The source code of the full project can be downloaded from the link in the top right corner of this page.

Why Native filters?

Microsoft has published a very similar example to this project, where they convert the camera viewfinder images to grayscale; it runs on both WP7 and WP8. Get it here. It works well, but the gray filter is quite simple; more complicated filters will require more computation, and the CPU limits are quickly reached when processing VGA frames (640x480) at 30 frames per second. The speed gained by going closer to the metal may be needed for complex algorithms. Also, you might already have your own image filters written in C++ for other platforms that you can reuse without converting them to C#. Finally, as we will see in other wiki entries, the native side opens further optimization possibilities, like using DirectX or the ARM Neon instruction set.

Setting up the viewfinder

Our UI will be XAML-based. The UI will control everything, while the C++ side is rather dumb, simply executing the filtering when asked to. Let's first create the projects for both the XAML and the C++ components:

  • Start by creating a new project, of type Windows Phone App. You will find the template under the Visual C#/Windows Phone category. That will be our UI.
  • Add a new project to your solution, of type Windows Phone Runtime Component. That template is under the Visual C++/Windows Phone category. That will be our image filter.


We then add a live camera stream to our UI. That is easily done by:

  • In your XAML, define a rectangle that will be painted using a video brush:
<Grid x:Name="LayoutRoot" Background="Transparent">
    <Rectangle Width="640" Height="480" Canvas.ZIndex="1">
        <Rectangle.Fill>
            <VideoBrush x:Name="viewfinderBrush" />
        </Rectangle.Fill>
    </Rectangle>
</Grid>
  • In the page loaded event, create a PhotoCaptureDevice and set it as the source of the video brush:
Windows.Foundation.Size resolution = new Windows.Foundation.Size(640, 480);
m_camera = await PhotoCaptureDevice.OpenAsync(CameraSensorLocation.Back, resolution);
viewfinderBrush.SetSource(m_camera);

By now, with these 10 lines of code, you should have an application with a functional camera! Note that in the last step we have used the Windows PRT class Windows.Phone.Media.Capture.PhotoCaptureDevice, which is new to Windows Phone 8. In WP7, one would use the Silverlight/.NET class Microsoft.Devices.PhotoCamera().

Talking with the native(s)

So far, that was easy, but now we need to be a bit careful. We will handle quite a huge amount of data. The viewfinder frames, 640x480 pixels refreshed at a rate of 30 frames per second, will go back and forth between managed code (our UI) and the native side (our C++ filter). That's a lot of pixels per second! We have to avoid any useless copy operations, as copies (memory accesses) will hurt our performance.
We will implement the following sequence diagram, for each new viewfinder frame coming from the camera:

SequenceDiagram.png

To get the data of the frames coming from the camera, we will have to call the platform function Windows::Phone::Media::Capture::ICameraCaptureDevice::GetPreviewBufferArgb. Let's look at its definition:

void GetPreviewBufferArgb(Platform::WriteOnlyArray<int, 1U>^ pixels)

That function fills an array that we provide with the camera data. That's a copy operation we can't avoid. The buffer is of type Platform::WriteOnlyArray, which, according to the MSDN documentation, is to be used when the caller passes an array for the method to fill. Makes sense. We will also use that buffer type to communicate between our managed component and our native component.

The public interface of our native component will be:

public ref class WindowsPhoneRuntimeComponent sealed
{
public:
    WindowsPhoneRuntimeComponent();
    void Initialize(Windows::Phone::Media::Capture::PhotoCaptureDevice^ captureDevice);
    void NewViewfinderFrame(Platform::WriteOnlyArray<int,1U>^ frameData);
    ...
};
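
Inside the component, Initialize keeps hold of the capture device, while NewViewfinderFrame pulls the latest preview frame into the caller's buffer and filters it in place. Here is a minimal sketch of how that could look (the member name m_captureDevice is an assumption of this sketch; the actual implementation is in the project download):

void WindowsPhoneRuntimeComponent::Initialize(Windows::Phone::Media::Capture::PhotoCaptureDevice^ captureDevice)
{
    // Keep a reference to the camera so preview frames can be fetched later.
    // m_captureDevice is assumed to be a PhotoCaptureDevice^ member of the component.
    m_captureDevice = captureDevice;
}

void WindowsPhoneRuntimeComponent::NewViewfinderFrame(Platform::WriteOnlyArray<int,1U>^ frameData)
{
    // First copy: the camera subsystem fills the buffer provided by the managed side.
    m_captureDevice->GetPreviewBufferArgb(frameData);

    // Run the filter in place on the same buffer (see the grayscale filter below).
    ConvertToGrayOriginal(frameData);
}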

On the managed side, we allocate the buffer when the application is initialized, and reuse that buffer for every frame:

m_frameData = new int[(int)m_camera.PreviewResolution.Height * (int)m_camera.PreviewResolution.Width];

Every time the camera subsystem fires a PreviewFrameAvailable event, we will call the native NewViewfinderFrame method, which will take care of filling the buffer with camera data and filtering that data.
When the UI side receives back the filtered camera data buffer, it is ready to be displayed. The UI side will do that with the following:

Deployment.Current.Dispatcher.BeginInvoke(delegate()
{
    m_frameData.CopyTo(m_wb.Pixels, 0);
    m_wb.Invalidate();
    m_processingFrame = false;
});

Again, an unavoidable copy! This time the data is copied to a WriteableBitmap, m_wb. During the initialization phase, we defined m_wb as the source for the XAML Image component that displays our filtered viewfinder.

So, we end up copying the camera data twice: first from the camera subsystem into our application, then from our application to the display subsystem.
Note that for this example, my data buffer is in the RGBA format. It's an easy format to handle, and the most familiar one for most of us. However, it is not very efficient in terms of size or image manipulation performance. Using a YUV format/color space would make more sense.
That covers the main lines of the application's data handling. I skipped the not-so-interesting code; get the source code of the full project from the link in the top right corner of this page. The last thing to do is the filtering itself.

The filtering in C++

For the filtering, I took the code from [http://hilbert-space.de/?p=22 Nils Pipenbrinck's excellent blog entry on Neon optimization].

void WindowsPhoneRuntimeComponent::ConvertToGrayOriginal(Platform::WriteOnlyArray<int,1U>^ frameData)
{
    uint8 * src  = (uint8 *) frameData->Data;
    uint8 * dest = (uint8 *) frameData->Data;
    int n = frameData->Length;
    int i;
    for (i = 0; i < n; i++)
    {
        int r = *src++;  // load red
        int g = *src++;  // load green
        int b = *src++;  // load blue
        src++;           // skip alpha

        // build weighted average:
        int y = (r*77) + (g*151) + (b*28);

        // undo the scale by 256 and write to memory:
        *dest++ = (y >> 8);
        *dest++ = (y >> 8);
        *dest++ = (y >> 8);
        dest++;
    }
}

If you look in the project code, you will find the same filter, but optimized with the ARM Neon instruction set. I'll let you pick the one you prefer.
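
For a flavour of what that looks like, here is a minimal sketch of the same grayscale pass written with Neon intrinsics rather than the inline assembly used in the blog post. This is a hypothetical illustration, not the project's exact code; it assumes the pixel count is a multiple of 8 and the same byte layout as above.

#include <arm_neon.h>

// Sketch only: processes 8 pixels per iteration, assumes n is a multiple of 8.
void ConvertToGrayNeonSketch(uint8 * data, int n)
{
    uint8x8_t wR = vdup_n_u8(77);   // red weight
    uint8x8_t wG = vdup_n_u8(151);  // green weight
    uint8x8_t wB = vdup_n_u8(28);   // blue weight

    for (int i = 0; i < n; i += 8)
    {
        // De-interleave 8 pixels into four vectors: val[0]=R, val[1]=G, val[2]=B, val[3]=A
        uint8x8x4_t px = vld4_u8(data);

        // Weighted sum in 16-bit precision: y = r*77 + g*151 + b*28
        uint16x8_t y = vmull_u8(px.val[0], wR);
        y = vmlal_u8(y, px.val[1], wG);
        y = vmlal_u8(y, px.val[2], wB);

        // Undo the scale by 256 with a narrowing shift, duplicate into R, G and B
        uint8x8_t gray = vshrn_n_u16(y, 8);
        px.val[0] = gray;
        px.val[1] = gray;
        px.val[2] = gray;

        // Re-interleave and store the 8 pixels back; alpha is left untouched
        vst4_u8(data, px);
        data += 8 * 4;
    }
}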

Wrapping it up

Hopefully this short article will help you get started writing native filters on Windows Phone 8. Remember to always keep performance in mind; these types of applications are very CPU intensive. Using C++ for your filters is one way to improve performance. If the performance gain is not enough for your use cases, you might have a look at hooking your camera directly into DirectX, or at optimizing your filter with Neon.
