Real-time Filter Demo for Windows Phone 8.0

Real-time Filter Demo is a Nokia example application demonstrating the use of the Nokia Imaging SDK for real-time image effects. The effects are applied to the stream received from the camera and shown in the viewfinder. This app does not support capturing photos.

Compatibility

  • Compatible with Windows Phone 8.0 phones.
  • Tested with Nokia Lumia 920 and Nokia Lumia 520.
  • Developed with Visual Studio 2012 Express for Windows Phone 8.
  • Compiling the project requires Nokia Imaging SDK.

Design

The user interface (UI) of the application is deliberately simple: the filter is changed using the buttons in the application bar. The filter index and name are shown in the bottom-left corner, and the frame rate in the bottom-right corner. The application menu contains a menu item for displaying the about page with information about the example.

Architecture Overview 

Figure 1. Application architecture. Note that many methods, properties and members are omitted.

The example consists of three key classes. The main page is a typical phone application page implemented by a XAML file and a C# counterpart. The main page implements the application UI, including the MediaElement which displays the camera viewfinder with an effect applied. The MainPage class also owns the instances of the two other key classes: CameraStreamSource and NokiaImagingSDKEffects. CameraStreamSource, derived from MediaStreamSource, provides the camera data, while NokiaImagingSDKEffects implements all the effects of the application.

Managing the camera stream

The camera stream is managed by the CameraStreamSource class. Its GetSampleAsync(...) method uses NokiaImagingSDKEffects.GetNewFrameAndApplyEffect(...) to get the modified camera buffer:

...

public class CameraStreamSource
{
    private readonly Dictionary<MediaSampleAttributeKeys, string> _emptyAttributes =
        new Dictionary<MediaSampleAttributeKeys, string>();

    private MediaStreamDescription _videoStreamDescription = null;
    private MemoryStream _frameStream = null;
    private ICameraEffect _cameraEffect = null;
    private long _currentTime = 0;
    private int _frameStreamOffset = 0;
    private int _frameTime = 0;
    private int _frameCount = 0;
    private Size _frameSize = new Size(0, 0);
    private int _frameBufferSize = 0;
    private byte[] _frameBuffer = null;

    ...

    protected override void OpenMediaAsync()
    {
        // Member variables are initialized here

        ...
    }

    protected override void GetSampleAsync(MediaStreamType mediaStreamType)
    {
        var task = _cameraEffect.GetNewFrameAndApplyEffect(_frameBuffer.AsBuffer(), _frameSize);
        
        // When the asynchronous call completes, report that the sample is ready

        task.ContinueWith((action) =>
        {
            _frameStream.Position = 0;
            _currentTime += _frameTime;
            _frameCount++;

            var sample = new MediaStreamSample(_videoStreamDescription, _frameStream, _frameStreamOffset,
                                               _frameBufferSize, _currentTime, _emptyAttributes);

            ReportGetSampleCompleted(sample);
        });
    }

    ...
}
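The _frameCount and _currentTime members above can also be used to derive the frame rate shown in the bottom-right corner of the UI. Here is a minimal standalone sketch of that calculation (the helper class is illustrative and not part of the original source; it assumes the timestamp is kept in the 100-nanosecond ticks used by MediaStreamSample):

```csharp
// Hypothetical helper: derives frames per second from the sample counter
// and the running timestamp. MediaStreamSample timestamps are expressed
// in 100-nanosecond ticks, so one second equals 10,000,000 ticks.
public static class FrameRateCounter
{
    private const double TicksPerSecond = 10000000.0;

    public static double CalculateFps(int frameCount, long currentTimeTicks)
    {
        if (currentTimeTicks <= 0)
        {
            return 0.0;
        }

        return frameCount / (currentTimeTicks / TicksPerSecond);
    }
}
```

For example, 150 frames over a 50,000,000-tick (5-second) span gives 30 frames per second.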

Linking the stream to the media element on the screen

Displaying the manipulated camera buffer on the screen is handled by the MainPage class. In the XAML declaration of the class, a VideoBrush is set as the background of the LayoutRoot grid.

<Grid x:Name="LayoutRoot" Tap="LayoutRoot_Tap">
    <Grid.Background>
        <VideoBrush x:Name="BackgroundVideoBrush"/>
    </Grid.Background>

    ...

</Grid>

In the C# code we set our custom CameraStreamSource as the source for a MediaElement and then set the MediaElement as a source for the VideoBrush:

...

public class MainPage : PhoneApplicationPage
{
    private MediaElement _mediaElement = null;
    private CameraStreamSource _cameraStreamSource = null;
    
    ...

    private async void Initialize()
    {
        // Camera stream source is initialized here

        ...

        _mediaElement = new MediaElement();
        _mediaElement.Stretch = Stretch.UniformToFill;
        _mediaElement.BufferingTime = new TimeSpan(0);
        _mediaElement.SetSource(_cameraStreamSource);

        BackgroundVideoBrush.SetSource(_mediaElement);

        ...
    }

    ...
}

Applying the effect

The effect is applied by the GetNewFrameAndApplyEffect(...) method in the NokiaImagingSDKEffects class. The process is simple. First the CameraPreviewImageSource, created on top of the PhotoCaptureDevice, is invalidated so that a fresh preview frame is fetched from the camera. Then a new Bitmap is created on top of the frameBuffer argument. Finally a BitmapRenderer renders the active effect into that bitmap.

public class NokiaImagingSDKEffects
{
    private PhotoCaptureDevice _photoCaptureDevice = null;
    private CameraPreviewImageSource _cameraPreviewImageSource = null;
    private FilterEffect _filterEffect = null;
    private CustomEffectBase _customEffect = null;
    private Semaphore _semaphore = new Semaphore(1, 1);

    ...

    public PhotoCaptureDevice CaptureDevice
    {
        set
        {
            if (_photoCaptureDevice != value)
            {
                while (!_semaphore.WaitOne(100));

                _photoCaptureDevice = value;

                Initialize();

                _semaphore.Release();
            }
        }
    }

    ...

    public async Task GetNewFrameAndApplyEffect(IBuffer frameBuffer, Size frameSize)
    {
        if (_semaphore.WaitOne(500))
        {
            _cameraPreviewImageSource.InvalidateLoad(); // Invalidate camera frame

            var scanlineByteSize = (uint)frameSize.Width * 4; // 4 bytes per pixel in Bgra8888 mode
            var bitmap = new Bitmap(frameSize, ColorMode.Bgra8888, scanlineByteSize, frameBuffer);

            if (_filterEffect != null)
            {
                var renderer = new BitmapRenderer(_filterEffect, bitmap);
                await renderer.RenderAsync();
            }
            else if (_customEffect != null)
            {
                var renderer = new BitmapRenderer(_customEffect, bitmap);
                await renderer.RenderAsync();
            }

            ...

            _semaphore.Release();
        }
    }

    private void Initialize()
    {
        _cameraPreviewImageSource = new CameraPreviewImageSource(_photoCaptureDevice);
      
        var filters = new List<IFilter>();

        ...

        switch (_effectIndex)
        {
            case 0:
                {
                    filters.Add(new LomoFilter(0.5, 0.5, LomoVignetting.High, LomoStyle.Yellow));
                }
                break;

            ...

            case 10:
                {
                    _customEffect = new CustomEffect(_cameraPreviewImageSource);
                }
                break;
        }

        if (filters.Count > 0)
        {
            _filterEffect = new FilterEffect(_cameraPreviewImageSource)
            {
                Filters = filters
            };
        }
    }

    ...
}
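GetNewFrameAndApplyEffect(...) relies on the frame buffer being laid out in Bgra8888 with no padding between scanlines. The size arithmetic involved can be sketched in isolation (the helper names are illustrative, not from the original source):

```csharp
// Illustrative helpers: in Bgra8888 every pixel occupies 4 bytes
// (blue, green, red, alpha), so one scanline is width * 4 bytes
// and a whole frame needs width * height * 4 bytes.
public static class FrameLayout
{
    public const int BytesPerPixel = 4; // Bgra8888

    public static uint ScanlineByteSize(int width)
    {
        return (uint)(width * BytesPerPixel);
    }

    public static int FrameBufferSize(int width, int height)
    {
        return width * height * BytesPerPixel;
    }
}
```

For a 640 x 480 preview frame, the scanline stride is 2560 bytes and the frame buffer must hold 1,228,800 bytes.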

Implementing custom effects

It is also easy to implement your own custom effects and use them instead of the FilterEffect that comes with the Nokia Imaging SDK. It is important to note that custom effects are not instances of IFilter used with a FilterEffect; instead they derive from CustomEffectBase, which sits on the same hierarchy level as FilterEffect and is therefore used in the same manner in the rendering workflow.

Here's a simple custom effect that does a weighted inverse grayscale effect on the image:

public class CustomEffect : CustomEffectBase
{
    ...

    public CustomEffect(IImageProvider source) : base(source)
    {
    }

    protected override void OnProcess(PixelRegion sourcePixelRegion, PixelRegion targetPixelRegion)
    {
        var sourcePixels = sourcePixelRegion.ImagePixels;
        var targetPixels = targetPixelRegion.ImagePixels;

        sourcePixelRegion.ForEachRow((index, width, position) =>
        {
            for (int x = 0; x < width; ++x, ++index)
            {
                // The only supported color mode is ColorMode.Bgra8888

                uint pixel = sourcePixels[index];
                uint blue = pixel & 0x000000ff; // blue color component
                uint green = (pixel & 0x0000ff00) >> 8; // green color component
                uint red = (pixel & 0x00ff0000) >> 16; // red color component
                uint average = (uint)(0.0722 * blue + 0.7152 * green + 0.2126 * red); // weighted average component
                uint grayscale = 0xff000000 | average | (average << 8) | (average << 16); // use average for each color component

                targetPixels[index] = ~grayscale; // use inverse grayscale
            }
        });
    }
}
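The per-pixel arithmetic inside OnProcess can be exercised on its own. The sketch below repeats the same unpack, weighted-average, repack, and invert steps for a single packed Bgra8888 pixel value (the helper name is illustrative, not part of the original source):

```csharp
public static class PixelMath
{
    // Applies the same per-pixel transform as CustomEffect.OnProcess:
    // extract the B, G and R components, compute their weighted average
    // (Rec. 709 luma weights), pack the average back into each color
    // channel with a fully opaque alpha, and invert the result.
    public static uint InverseGrayscale(uint pixel)
    {
        uint blue = pixel & 0x000000ff;
        uint green = (pixel & 0x0000ff00) >> 8;
        uint red = (pixel & 0x00ff0000) >> 16;
        uint average = (uint)(0.0722 * blue + 0.7152 * green + 0.2126 * red);
        uint grayscale = 0xff000000 | average | (average << 8) | (average << 16);
        return ~grayscale;
    }
}
```

For instance, an opaque black pixel 0xff000000 has average 0, packs to 0xff000000, and inverts to 0x00ffffff. Note that the final bitwise NOT also inverts the alpha channel, exactly as in the effect above.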

See the CustomEffectBase section under the Core concepts page for another example and more details.

Downloads

Real-time Filter Demo project v2.0 real-time-filter-demo-2.0.zip

Get the app from the Windows Phone Store.

This example application is hosted in GitHub, where you can check the latest activities, report issues, browse source, ask questions, or even contribute to the project yourself.

Note: Due to the time required to pass the store certification process, the version in the store may not always be the latest version.


Last updated 14 April 2014
