Real-time Blend Demo for Windows Phone 8.0

Real-time Blend Demo is a Nokia Developer example demonstrating real-time use of the blend filter provided by the Nokia Imaging SDK. The selected texture is applied to the phone's camera preview stream using the selected blend mode and blend effect level, and the user can explore the results by switching between textures, blend modes, and effect levels. The sample also demonstrates the use of a gradient input image (GradientImageSource): one of the available textures is generated in code instead of being loaded from a bitmap.

Version 1.2 adds blending of partial textures with drag, pinch-zoom, and rotate gestures for exact positioning, size, and orientation.

Compatibility

  • Compatible with Windows Phone 8.0 phones.
  • Tested on Nokia Lumia 520, Nokia Lumia 1020 and Nokia Lumia 1520.
  • Developed with Visual Studio 2012 Express for Windows Phone 8.
  • Compiling the project requires Nokia Imaging SDK.

Design

The application is a dual-page app: a main page with a full-screen viewfinder to which the blend effect is applied, and a texture selection page for changing the active blend texture. On the main viewfinder page the camera preview image stream is blended with the selected texture using the selected blend mode and blend effect level. The blend mode is changed by tapping the left and right indicators in the application bar, the blend effect level is adjusted with the slider on the left, and the active texture is changed by tapping the "change texture" button in the application bar.

Figure 1. Lens flare texture and hardlight blend mode with effect level close to maximum

Figure 2. Texture selection page allows user to select the texture to be used with the blend effect

There are two types of textures: full-screen and partial. Selecting a partial texture enables texture positioning: the user can move, scale, and rotate the texture with drag and pinch gestures. Under the hood there is no difference; full-screen and partial textures are treated the same, and the differentiation happens at the UI level, where selecting a partial texture enables the gesture handling code. Textures are simply semitransparent bitmaps, with the exception of the red-green-blue gradient texture, which is technically not a bitmap but a Nokia Imaging SDK GradientImageSource, meaning that the gradient input image is generated in code.

Architecture overview

Figure 3. Application architecture overview class diagram (simplified)

The application is built around four key classes. The MainPage shows the camera preview image feed using a VideoBrush and a MediaElement. The MediaElement gets the modified (blend effect applied) image stream from the CameraStreamSource, which in turn gets the modified frames from the NokiaImagingSDKEffects class connected to the PhotoCaptureDevice.
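
The way these classes are wired together can be summarized with the following sketch. The constructor shapes and the local variable names are illustrative assumptions only; the actual initialization code is shown in the sections below.

// Illustrative wiring only; constructor signatures here are assumptions, not the project's exact API
var cameraEffect = new NokiaImagingSDKEffects();
cameraEffect.CaptureDevice = photoCaptureDevice;          // effect pulls preview frames from the camera

var cameraStreamSource = new CameraStreamSource(cameraEffect, previewSize);

var mediaElement = new MediaElement { Stretch = Stretch.UniformToFill };
mediaElement.SetSource(cameraStreamSource);               // MediaElement requests blended samples from the stream source

BackgroundVideoBrush.SetSource(mediaElement);             // VideoBrush paints the MediaElement output as the page background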

Managing the camera preview image stream

The camera preview image stream is managed by the CameraStreamSource class. Its GetSampleAsync(...) method uses NokiaImagingSDKEffects.GetNewFrameAndApplyEffect(...) to get the modified camera buffer and reports it to the MediaElement with the protected MediaStreamSource.ReportGetSampleCompleted(...) method.

...

public class CameraStreamSource : MediaStreamSource
{
    private readonly Dictionary<MediaSampleAttributeKeys, string> _emptyAttributes =
        new Dictionary<MediaSampleAttributeKeys, string>();

    private MediaStreamDescription _videoStreamDescription = null;
    private MemoryStream _frameStream = null;
    private ICameraEffect _cameraEffect = null;
    private long _currentTime = 0;
    private int _frameStreamOffset = 0;
    private int _frameTime = 0;
    private int _frameCount = 0;
    private Size _frameSize = new Size(0, 0);
    private int _frameBufferSize = 0;
    private byte[] _frameBuffer = null;

    ...
    
    protected override void OpenMediaAsync()
    {
        // Member variables are initialized here
        
        ...
    }

    protected override void GetSampleAsync(MediaStreamType mediaStreamType)
    {
        var task = _cameraEffect.GetNewFrameAndApplyEffect(_frameBuffer.AsBuffer(), _frameSize);

        // When the asynchronous call completes, proceed by reporting the sample as completed

        task.ContinueWith((action) =>
        {
            _frameStream.Position = 0;
            _currentTime += _frameTime;
            _frameCount++;

            var sample = new MediaStreamSample(_videoStreamDescription, _frameStream, _frameStreamOffset,
                                               _frameBufferSize, _currentTime, _emptyAttributes);
            ReportGetSampleCompleted(sample);
        });
    }

    ...
}
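
The OpenMediaAsync() body elided above is where the member variables are initialized: it describes a single raw video stream, allocates the frame buffer that GetSampleAsync(...) renders into, and reports the media as opened. The following is a minimal sketch of such an implementation, not the project's exact code; the "RGBA" FourCC, the ~30 fps frame time, and the buffer setup are assumptions.

protected override void OpenMediaAsync()
{
    // Describe the single raw video stream produced from the blended preview frames
    var streamAttributes = new Dictionary<MediaStreamAttributeKeys, string>();
    streamAttributes[MediaStreamAttributeKeys.VideoFourCC] = "RGBA"; // assumed FourCC for raw 32-bit frames
    streamAttributes[MediaStreamAttributeKeys.Width] = ((int)_frameSize.Width).ToString();
    streamAttributes[MediaStreamAttributeKeys.Height] = ((int)_frameSize.Height).ToString();

    _videoStreamDescription = new MediaStreamDescription(MediaStreamType.Video, streamAttributes);

    // Allocate the buffer that GetSampleAsync(...) renders each blended frame into
    _frameBufferSize = (int)_frameSize.Width * (int)_frameSize.Height * 4; // 4 bytes per pixel
    _frameBuffer = new byte[_frameBufferSize];
    _frameStream = new MemoryStream(_frameBuffer);

    _frameTime = (int)TimeSpan.FromSeconds(1.0 / 30.0).Ticks; // assumed ~30 fps sample spacing

    // A live, non-seekable stream: zero duration, seeking disabled
    var sourceAttributes = new Dictionary<MediaSourceAttributesKeys, string>();
    sourceAttributes[MediaSourceAttributesKeys.Duration] = TimeSpan.Zero.Ticks.ToString();
    sourceAttributes[MediaSourceAttributesKeys.CanSeek] = false.ToString();

    ReportOpenMediaCompleted(sourceAttributes, new[] { _videoStreamDescription });
}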

Displaying the camera preview image stream

Displaying the modified camera preview image stream on the screen is handled by the MainPage class. In the XAML declaration of the page, a VideoBrush is used as the background of the LayoutRoot grid.

<Grid x:Name="LayoutRoot" Tap="LayoutRoot_Tap" ManipulationStarted="LayoutRoot_ManipulationStarted" ManipulationDelta="LayoutRoot_ManipulationDelta">
    <Grid.Background>
        <VideoBrush x:Name="BackgroundVideoBrush"/>
    </Grid.Background>

    ...

</Grid>

In the C# code, an instance of CameraStreamSource is set as the source for a MediaElement, and the MediaElement is then set as the source for the VideoBrush:

...

public partial class MainPage : PhoneApplicationPage
{
    private MediaElement _mediaElement = null;
    private CameraStreamSource _cameraStreamSource = null;

    ...

    private async void Initialize()
    {
        // Camera stream source is initialized here

        ...

        _mediaElement = new MediaElement();
        _mediaElement.Stretch = Stretch.UniformToFill;
        _mediaElement.BufferingTime = new TimeSpan(0);
        _mediaElement.SetSource(_cameraStreamSource);

        BackgroundVideoBrush.SetSource(_mediaElement);

        ...
    }

    ...

}

Applying the blend effect using a bitmap or an in-code generated gradient texture

The blend effect is applied by the GetNewFrameAndApplyEffect(...) method of the NokiaImagingSDKEffects class. Note that if no valid texture image URI has been set with the SetTexture(...) method, the Initialize(...) method automatically generates the red-green-blue texture in code using the Nokia Imaging SDK's GradientImageSource.

public class NokiaImagingSDKEffects
{
    private PhotoCaptureDevice _photoCaptureDevice = null;
    private CameraPreviewImageSource _cameraPreviewImageSource = null;
    private FilterEffect _filterEffect = null;
    private BlendFilter _blendFilter = null;
    private Uri _blendImageUri = null;
    private IImageProvider _blendImageProvider = null;
    private Semaphore _semaphore = new Semaphore(1, 1);

    ...

    public double EffectLevel { get; set; }

    public PhotoCaptureDevice CaptureDevice
    {
        set
        {
            if (_photoCaptureDevice != value)
            {
                while (!_semaphore.WaitOne(100));

                _photoCaptureDevice = value;

                Initialize();

                _semaphore.Release();
            }
        }
    }

    ...

    public async Task GetNewFrameAndApplyEffect(IBuffer frameBuffer, Size frameSize)
    {
        if (_semaphore.WaitOne(500))
        {
            var scanlineByteSize = (uint)frameSize.Width * 4; // 4 bytes per pixel in Bgra8888 mode
            var bitmap = new Bitmap(frameSize, ColorMode.Bgra8888, scanlineByteSize, frameBuffer);

            if (_filterEffect != null)
            {
                _blendFilter.Level = EffectLevel;

                var renderer = new BitmapRenderer(_filterEffect, bitmap);
                await renderer.RenderAsync();
            }
            else
            {
                var renderer = new BitmapRenderer(_cameraPreviewImageSource, bitmap);
                await renderer.RenderAsync();
            }

            ...

            _semaphore.Release();
        }
    }

    public void SetTexture(Uri textureUri)
    {
        if (_semaphore.WaitOne(500))
        {
            Uninitialize();

            _blendImageUri = textureUri;

            Initialize();

            _semaphore.Release();
        }
    }

    private void Initialize()
    {
        _cameraPreviewImageSource = new CameraPreviewImageSource(_photoCaptureDevice);

        if (_blendImageUri != null)
        {
            // Using the texture set with the SetTexture method

            _blendImageProvider = new StreamImageSource(System.Windows.Application.GetResourceStream(_blendImageUri).Stream);
        }
        else
        {
            // No texture set with the SetTexture method; fall back to an in-code
            // generated red-green-blue gradient texture

            var colorStops = new GradientStop[]
            {
                new GradientStop() { Color = Color.FromArgb(0xFF, 0xFF, 0x00, 0x00), Offset = 0.0 }, // Red
                new GradientStop() { Color = Color.FromArgb(0xFF, 0x00, 0xFF, 0x00), Offset = 0.7 }, // Green
                new GradientStop() { Color = Color.FromArgb(0xFF, 0x00, 0x00, 0xFF), Offset = 1.0 }  // Blue
            };

            var gradient = new RadialGradient(new Point(0, 0), new EllipseRadius(1, 0), colorStops);
            var size = new Size(640, 480);

            _blendImageProvider = new GradientImageSource(size, gradient);
        }

        switch (_effectIndex)
        {
            case 0:
                {
                    EffectName = "1/16 - None";
                }
                break;

            case 1:
                {
                    EffectName = "2/16 - Normal";
                    _blendFilter = new BlendFilter(_blendImageProvider, BlendFunction.Normal, EffectLevel);
                }
                break;

            ...
        }

        if (_blendFilter != null)
        {
            _blendFilter.TargetArea = _targetArea;
            _blendFilter.TargetAreaRotation = _targetAreaRotation;

            var filters = new List<IFilter>();

            filters.Add(_blendFilter);

            _filterEffect = new FilterEffect(_cameraPreviewImageSource)
            {
                Filters = filters
            };
        }
    }

    ...
}
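
SetTexture(...) above also calls an Uninitialize() method that is not shown. The sketch below illustrates what it plausibly does, namely releasing the SDK objects created in Initialize() so that they can be rebuilt with the new texture; the exact disposal logic is an assumption.

private void Uninitialize()
{
    // Release the current processing chain so Initialize() can build a fresh one
    if (_filterEffect != null)
    {
        _filterEffect.Dispose();
        _filterEffect = null;
    }

    if (_cameraPreviewImageSource != null)
    {
        _cameraPreviewImageSource.Dispose();
        _cameraPreviewImageSource = null;
    }

    _blendFilter = null;
    _blendImageProvider = null;
}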

Positioning the blend effect

By default the BlendFilter foreground image is scaled to fill the background image. The position and size of the foreground image can be defined with the TargetArea property of the BlendFilter, which specifies the rectangular area where the foreground image is drawn. The TargetArea coordinates are given in a unit coordinate space relative to the background image, where (0, 0) is the top-left corner and (1, 1) is the bottom-right corner. The foreground image is scaled and stretched to fit the TargetArea according to the TargetOutputOption property, and the TargetArea can also be rotated by setting the TargetAreaRotation property.
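
As a hedged example (variable names, blend function, level, and rotation value are illustrative, and the rotation is assumed to be in degrees), the following draws the foreground texture into the top-left quarter of the background, slightly rotated:

var blendFilter = new BlendFilter(textureImageSource, BlendFunction.Normal, 0.8)
{
    TargetArea = new Rect(0.0, 0.0, 0.5, 0.5), // unit coordinates: (0, 0) top-left, (1, 1) bottom-right
    TargetAreaRotation = 20.0                  // rotation of the target rectangle (assumed to be degrees)
};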

Figure 4. Positioned, rotated, and scaled blend filter applied to the camera viewfinder image

Figure 5. Partial texture selection view

In the Real-time Blend Demo, the textures selected from the "Partial" category can be moved, scaled, and rotated with drag and pinch gestures. The gesture event handling is implemented in the MainPage class by handling the manipulation events of the LayoutRoot element in the LayoutRoot_ManipulationDelta method (a sketch of such a handler follows the code below). The TargetArea and TargetAreaRotation of the BlendFilter are set in the SetTargetArea method of the NokiaImagingSDKEffects class. Note that a semaphore is used to prevent modification of the TargetArea while the filter is being rendered.

public void SetTargetArea(Rect targetArea, double targetAreaRotation)
{
    if (_semaphore.WaitOne(500))
    {
        _targetArea = targetArea;
        _targetAreaRotation = targetAreaRotation;
        if (_blendFilter != null)
        {
            _blendFilter.TargetArea = targetArea;
            _blendFilter.TargetAreaRotation = targetAreaRotation;
        }

        _semaphore.Release();
    }
}
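
For reference, a manipulation handler along the following lines can translate the gestures into a new target area. The field names, the scaling math, and the _cameraEffect field that reaches the NokiaImagingSDKEffects instance are illustrative assumptions rather than the project's exact implementation.

private Rect _textureArea = new Rect(0.25, 0.25, 0.5, 0.5); // current texture position in unit coordinates
private double _textureRotation = 0.0;

private void LayoutRoot_ManipulationDelta(object sender, ManipulationDeltaEventArgs e)
{
    // Convert the drag delta from screen pixels to the 0..1 unit coordinate space
    double dx = e.DeltaManipulation.Translation.X / LayoutRoot.ActualWidth;
    double dy = e.DeltaManipulation.Translation.Y / LayoutRoot.ActualHeight;

    // A pinch delta of (0, 0) means no pinch is in progress
    double scale = 1.0;
    if (e.DeltaManipulation.Scale.X > 0 && e.DeltaManipulation.Scale.Y > 0)
    {
        scale = (e.DeltaManipulation.Scale.X + e.DeltaManipulation.Scale.Y) / 2.0;
    }

    _textureArea = new Rect(_textureArea.X + dx, _textureArea.Y + dy,
                            _textureArea.Width * scale, _textureArea.Height * scale);

    // Rotation requires tracking the angle between the two touch points, which the
    // project derives in its own gesture code; that part is omitted here.

    _cameraEffect.SetTargetArea(_textureArea, _textureRotation); // _cameraEffect: assumed NokiaImagingSDKEffects field
}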

Downloads

Real-time Blend Demo project v1.2 real-time-blend-demo-v1.2.zip

Get the app from the Windows Phone Store.

This example application is hosted on GitHub, where you can check the latest activity, report issues, browse the source, ask questions, or even contribute to the project yourself.

Note: Due to the time required to pass the store certification process, the version in the store may not always be the latest version.


Last updated 2 April 2014
