Optimizing Imaging SDK use for rapidly changing filter parameters


Note: This is an entry in the Nokia Imaging and Big UI Wiki Competition 2013Q4.

Featured Article (29 Sep 2013)

This article explains how to use the Nokia Imaging SDK efficiently during rapid user interaction.

See Also

Modifying filter properties on the fly (Lumia Developers' Library)
Article Metadata
Code Example
Source code: interactiveFiltering.zip
Tested with
SDK: Windows Phone 8.0 SDK
Devices: Lumia 920, 820, 620
Compatibility
Platform(s): Windows Phone 8
Dependencies: Nokia Imaging SDK 1.0
Article
Created: yan_ (30 Jul 2013)
Last edited: hamishwillee (26 Nov 2013)


Introduction

Applying image effects with the Nokia Imaging SDK is both easy and efficient. However, there are use-cases where a naive application of the filters can result in unnecessary memory use and processing, which can in turn result in less-than-smooth UI behavior. One such example is when image processing code is connected directly to the value of a slider, and a new version of the image is generated on every slider event.

This article provides an overview of filter use and its "limitations", explains how filters can be misused (taking the slider case as an example), and presents a state machine that can be used to handle the user interaction more effectively.

Pre-requisites

A basic understanding of how the Imaging SDK is used is recommended (but not essential). The links below provide a good starting point:

  • Download and add the libraries to the project (Nokia Imaging SDK documentation)
  • Quick Start (Nokia Imaging SDK documentation)
  • How to Use the Nokia Imaging SDK to create a "Photo Cookbook" for Windows Phone 8 Devices
  • Filter Testing app using the Imaging SDK

Imaging SDK

To manipulate a picture, the Imaging SDK defines two interfaces:

  • IImageProvider: input class which provides pixels.
  • IImageConsumer: output class which consumes pixels provided by an IImageProvider.


These interfaces are used to make a pipeline between an Image source and a renderer:

Typical pipeline (Equivalent to EditingSession from the beta version of the Imaging SDK)


Image sources implement IImageProvider:

  • BitmapImageSource: the picture is a decoded Bitmap instance.
  • BufferImageSource: the picture is an encoded file loaded into an IBuffer.
  • RandomAccessStreamImageSource: accesses the encoded picture file through an IRandomAccessStream.
  • StorageFileImageSource: accesses the encoded picture file through an IStorageFile.
  • StreamImageSource: accesses the encoded picture file through a System.IO.Stream.
  • etc.

Note: The SDK can read JPEG, PNG, GIF, BMP, WBMP and TIFF files.

To work with high resolution pictures, you must use an IImageProvider which uses the image file (the SDK will decode only the pixels it needs).

Renderers implement IImageConsumer:

  • BitmapRenderer: renders to a Nokia.Graphics.Imaging.Bitmap.
  • WriteableBitmapRenderer: renders to a WriteableBitmap.
  • JpegRenderer: encodes the rendering to a JPEG and returns the file in an IBuffer.


The SDK provides classes which implement both the IImageConsumer and IImageProvider interfaces to modify pixels between an image source and a renderer:

  • FilterEffect: applies a collection of IFilter objects. The collection is accessible through the Filters property and the filters are applied in order.
  • CustomEffectBase: used to implement a custom filter.

Note: When you use a FilterEffect, the IFilter collection is accessible with the Filters property. You can easily add, remove and move filters; they are applied in order.

An IImageProvider can be shared between pipelines, for example to create a different pipeline which uses the same image source.

Warning: The rendering process is asynchronous, and you can't reuse an IImageProvider until processing completes.
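
For example, here is a minimal sketch of a complete pipeline that saves a filtered picture as a JPEG (this is an illustration, not code from the sample; photoStream is an assumed System.IO.Stream containing an encoded picture, and the snippet must run inside an async method):

//requires: using Nokia.Graphics.Imaging; using Windows.Storage.Streams;
using (var source = new StreamImageSource(photoStream))
using (var effect = new FilterEffect(source) { Filters = new IFilter[] { new AntiqueFilter() } })
using (var renderer = new JpegRenderer(effect))
{
    //source => effect => renderer: encode the filtered picture to a JPEG buffer
    IBuffer jpegOutput = await renderer.RenderAsync();
    //jpegOutput can now be written to a file or stream
}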

Unmanaged resources

The Nokia Imaging SDK is a WinPRT component written in C++ that allocates unmanaged resources. Since C# uses a garbage collector, you don't know when its classes, and their unmanaged resources, will be released. Application memory can grow quickly if developers do not take specific action to release them promptly.

For this reason, image sources, FilterEffect objects and renderers implement the IDisposable interface. This interface indicates that you can de-allocate unmanaged resources by calling the Dispose() function.

To simplify IDisposable class use, C# provides the using keyword. This keyword is equivalent to try/finally where Dispose() is called in the finally section.

With using keyword

//define the pipeline: source => effect => renderer
using (var source = new StreamImageSource(input))
using (var effect = new FilterEffect(source))
using (var renderer = new WriteableBitmapRenderer(effect, outputBitmap))
{
    effect.Filters = new IFilter[] { new AntiqueFilter() };
    await renderer.RenderAsync();
} //Dispose is called on source, effect and renderer.

Without using keyword

StreamImageSource source = null;
FilterEffect effect = null;
WriteableBitmapRenderer renderer = null;
try
{
    //instantiate the pipeline
    source = new StreamImageSource(input);
    effect = new FilterEffect(source);
    renderer = new WriteableBitmapRenderer(effect, outputBitmap);

    effect.Filters = new IFilter[] { new AntiqueFilter() };
    await renderer.RenderAsync();
}
finally
{
    if (source != null) source.Dispose(); //call Dispose
    source = null;

    if (effect != null) effect.Dispose(); //call Dispose
    effect = null;

    if (renderer != null) renderer.Dispose(); //call Dispose
    renderer = null;
}

Call Dispose() manually or use the using keyword, depending on your application context. The using keyword is preferred if you want to apply an effect to your picture and then don't need to re-use the image source.

Render to an Image Control

When you want to display the rendering result, it's better to use a WriteableBitmap that will not be scaled by the Image control. .NET control dimensions are in logical pixels, so to avoid scaling you must convert the Image control size to physical pixels and use that size for your WriteableBitmap. To convert logical pixel dimensions to physical pixel dimensions you need the scale factor between them. You could use the factor given by System.Windows.Application.Current.Host.Content.ScaleFactor; unfortunately, Microsoft decided that this value is the same for 720p and 1080p screens. On GDR3 you can get the physical screen resolution with DeviceExtendedProperties.GetValue("PhysicalScreenResolution"). As a Windows Phone device always has the same logical width (480), it is simple to compute the correct factor and use a WriteableBitmap with an optimized size:

private double _ScreenToPixelFactor = 0;
private double ScreenToPixelFactor
{
    get
    {
        if (_ScreenToPixelFactor == 0)
        {
            try
            {
                _ScreenToPixelFactor = ((System.Windows.Size)DeviceExtendedProperties.GetValue("PhysicalScreenResolution")).Width / 480;
            }
            catch (Exception)
            {
                _ScreenToPixelFactor = System.Windows.Application.Current.Host.Content.ScaleFactor / 100.0;
            }
        }
        return _ScreenToPixelFactor;
    }
}

...
var displayedBitmap = new WriteableBitmap((int)(imageControl.ActualWidth * ScreenToPixelFactor), (int)(imageControl.ActualHeight * ScreenToPixelFactor));
imageControl.Source = displayedBitmap;

User interaction

It's possible to share an IImageProvider between several pipelines - note however that once rendering has started the IImageProvider cannot be reused until it completes. Applying image effects with the Nokia Imaging SDK is easy and efficient, so usually these limitations have little effect on app design.

There are, however, user interactions where a naive design can have an impact - for example, if parameters change rapidly, a poor implementation will create an unnecessary pipeline for each new value.

The following sections explain this problem, and the solution, in the context of a slider controlling a filter parameter on an image. To simplify the code, we use the picture's file stream as the StreamImageSource input and a WriteableBitmapRenderer as the output. The WriteableBitmap size is optimized as explained in the previous section and displayed by an Image control.

Naive method

The Slider raises ValueChanged events when the user changes the slider position. A naive implementation would generate new images (and hence new filters) when the event is raised.

private async void filterparam_ValueChanged(object sender, RoutedPropertyChangedEventArgs<double> e)
{
    //reset stream position
    input.Seek(0, SeekOrigin.Begin);

    using (var source = new StreamImageSource(input))
    using (var effect = new FilterEffect(source))
    using (var renderer = new WriteableBitmapRenderer(effect, outputBitmap))
    {
        effect.Filters = new IFilter[]{
                             new SolarizeFilter(filterValue),
                             new TemperatureAndTintFilter(1 - filterValue, 0.0)};

        await renderer.RenderAsync();
        outputBitmap.Invalidate();
    }
}

Warning: The pipeline can't be reused while the renderer is processing. Therefore, in this case a new pipeline is created for each event.

Note: To manipulate the picture we use its file stream, so we need to reset the stream position before each pipeline creation.

When a user moves the slider, many events are raised. With the above implementation each event results in recreation of the pipeline and filters and a new asynchronous (parallel!) process to apply the effect to the image. Not only is generation of the parallel images unnecessary, but this approach has the following implications:

  • The app will use 100% of the CPU and allocate a lot of unmanaged resources.
  • The application GUI can be affected, resulting in "jerky" rendering of the image.
  • Processing time is not constant, so you don't know which parameters were actually applied to the final displayed image.


Naive implementation code example

Sample code implementing this "naive" solution is in the NokiaImagingFilter.cs file.

This class is composed of three properties and one function:

  • Input: the picture file stream.
  • Output: the target Image control.
  • filterValue: the filter parameter, in the range [0.0, 1.0].
  • processRendering(): processes the rendering to a WriteableBitmap.


When a property is modified, the processRendering() function is called.
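
For illustration, a minimal sketch of how such a property might trigger rendering (the property and field names are assumptions for this sketch, not taken verbatim from the sample):

private double filterValue;
public double FilterValue
{
    get { return filterValue; }
    set
    {
        filterValue = value;
        //naive approach: every change immediately triggers a full re-render
        processRendering();
    }
}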

Interactive State Machine

The naive method should be corrected to remove unnecessary processing, render the image display smoothly, and ensure the final image uses the correct parameters.

The approach we use to improve the naive method is to create a very simple state machine.

Interactive State Machine Diagram

Note: The Interactive State Machine is really simple to implement and you can easily adapt it to other contexts.

This state machine has three States:

  • WAIT: wait for new parameters before processing a new rendering.
  • APPLY: process the rendering.
  • SCHEDULE: save new parameters for the next rendering.


And a Transition can be caused by two events:

  • requestProcessing: parameters are updated and a new rendering is requested.
  • processFinished: the rendering process is finished.


When the user begins to move the slider:

  1. the parameter is updated => the APPLY state becomes active => the image is processed using the current parameter
  2. if the parameter is updated while a rendering is still in progress => the SCHEDULE state becomes active and the parameter is saved
  3. when the rendering process finishes => the APPLY state becomes active => a new rendering is processed

Note: If the user continues to move the slider, the state machine will loop between APPLY and SCHEDULE to avoid unnecessary processing.

Once the user finishes moving the slider:

  • if the rendering process finishes while the SCHEDULE state is active => the APPLY state becomes active => the image is processed using the saved parameter.
  • if the rendering process finishes while the APPLY state is active => the WAIT state becomes active => no more processing is necessary.

Note: As the SCHEDULE state saves the latest parameters every time they are updated, rendering is always done using the correct/most recent parameters.

State machine sample code

Sample code implementing this solution is in the InteractiveNokiaImagingFilter.cs file.

The class is composed of nine properties:

  • Input: the picture file stream.
  • Output: the Image control.
  • outputBitmap: the WriteableBitmap displayed by Output.
  • filterValue: the filter parameter, in the range [0.0, 1.0].
  • source: the pipeline image source.
  • effects: the pipeline FilterEffect.
  • renderer: the pipeline renderer, which uses outputBitmap as its target.
  • solarizeFilter: the first filter applied by effects.
  • temperatureAndTintFilter: the second filter applied by effects.


And three functions:

  • requestProcessing(): called when a parameter is updated.
  • processRendering(): processes the rendering asynchronously.
  • processFinished(): called at the end of the processRendering() function, when rendering is finished.

The pipeline is created once, and the solarizeFilter and temperatureAndTintFilter parameters are updated before each rendering; a sketch of this one-time setup is shown below, followed by the processRendering() function.
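
A minimal sketch of the one-time pipeline construction (an illustration based on the member names listed above, not verbatim code from the sample):

void createPipeline()
{
    //create the filters once; only their parameters change later
    solarizeFilter = new SolarizeFilter(filterValue);
    temperatureAndTintFilter = new TemperatureAndTintFilter(0.0, 0.0);

    //build the pipeline once: source => effects => renderer
    source = new StreamImageSource(input);
    effects = new FilterEffect(source);
    effects.Filters = new IFilter[] { solarizeFilter, temperatureAndTintFilter };
    renderer = new WriteableBitmapRenderer(effects, outputBitmap);
}

The processRendering() function is similar to the one used in the naive method described above, but with processFinished() called at the end and the pipeline reused: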

async void processRendering()
{
    try
    {
        if (output != null && source != null)
        {
            //update filters parameters
            solarizeFilter.Threshold = filterValue;
            temperatureAndTintFilter.Temperature = 1.0 - 2.0 * filterValue;

            //start rendering
            await renderer.RenderAsync();
            outputBitmap.Invalidate();
        }
    }
    catch (Exception)
    {
    }
    finally
    {
        processFinished();
    }
}

To implement the Interactive State Machine, we represent:

  • the states with the enum STATE.
  • the active state with the currentState member.
enum STATE
{
    WAIT,
    APPLY,
    SCHEDULE
};
//Current State
STATE currentState = STATE.WAIT;

Transitions are managed by the requestProcessing() and processFinished() functions. These functions update the active state and call processRendering() when the active state is APPLY.

void requestProcessing()
{
    switch (currentState)
    {
        //State machine transition : WAIT -> APPLY
        case STATE.WAIT:
            currentState = STATE.APPLY;
            //enter in APPLY state => apply the filter
            processRendering();
            break;

        //State machine transition : APPLY -> SCHEDULE
        case STATE.APPLY:
            currentState = STATE.SCHEDULE;
            break;

        //State machine transition : SCHEDULE -> SCHEDULE
        case STATE.SCHEDULE:
            currentState = STATE.SCHEDULE;
            break;
    }
}

void processFinished()
{
    switch (currentState)
    {
        //State machine transition : APPLY -> WAIT.
        case STATE.APPLY:
            currentState = STATE.WAIT;
            break;

        //State machine transition : SCHEDULE -> APPLY.
        case STATE.SCHEDULE:
            currentState = STATE.APPLY;
            //enter in APPLY state => apply the filter
            processRendering();
            break;
    }
}

Note: When the SCHEDULE state is active, we need to save the parameters. In this implementation we use the last updated parameters; since the parameters are class properties, we don't need to save them again.

Warning: This implementation is not thread safe; it relies on slider events being raised on the UI thread. If a parameter can be updated by another thread, you can use Deployment.Current.Dispatcher.BeginInvoke() to move the parameter update to the UI thread.
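
For example, a minimal sketch of marshalling an update from a background thread onto the UI thread (interactiveFilter, FilterValue and ComputeNewFilterValue() are assumed names for this illustration):

//somewhere on a worker thread
double newValue = ComputeNewFilterValue(); //assumed helper
Deployment.Current.Dispatcher.BeginInvoke(() =>
{
    //runs on the UI thread, so the state machine is only touched from one thread
    interactiveFilter.FilterValue = newValue;
});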

Optimizing rendering duration (optional)

This section explains an optimization, where the perceived performance is improved by trading off the output resolution for reduced calculation time.

Warning: This approach may or may not be useful in your particular application or use-case. The Imaging SDK is heavily optimized, but the documentation doesn't explain its optimizations. As a result, for some use-cases output resolution is an important factor; in others it may have no effect at all.

Imaging SDK rendering duration depends on a number of different factors, including:

  • Input type - encoded file or decoded buffer,
  • Input resolution,
  • Output resolution,
  • Filter(s) used,
  • Selected picture area,
  • SDK cache,
  • Hardware, memory, system
  • etc


This section explores the effects of output resolution on the rendering time for a number of different picture input resolutions. The example uses the context of the user moving the slider - rendering an intermediate "low resolution" picture much faster. To do this, we use two WriteableBitmap objects as the Image control source:

  • bitmapHR: full resolution bitmap
  • bitmapLR: low resolution bitmap

Full resolution is the optimized dimension described in Render to an Image Control above. Low resolution is simply the full resolution divided by a factor:

bitmapLR = new WriteableBitmap(
    bitmapHR.PixelWidth / 2,
    bitmapHR.PixelHeight / 2
);


Note: Unfortunately, when a bitmap is set as the Image control source, concurrent access between the SDK and the Image control can appear (see Another correction below).

To finish, we must know when the application must use low resolution or high resolution. If you use a slider, you can process low resolution rendering between its ManipulationStarted and ManipulationCompleted events.

private void slider_ManipulationStarted(object sender, System.Windows.Input.ManipulationStartedEventArgs e)
{
    interactiveFilter2.HRrendering = false;
}

private void slider_ManipulationCompleted(object sender, System.Windows.Input.ManipulationCompletedEventArgs e)
{
    interactiveFilter2.HRrendering = true;
}

Note: When the rendering resolution changes, requestProcessing() is called.

As an IImageProvider can be shared, we create a WriteableBitmapRenderer for each resolution and use the same first pipeline step (image source => FilterEffect). When processRendering() is called, we can then choose between the low resolution and high resolution renderer, as sketched below.
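
A minimal sketch of this shared pipeline (an illustration using the member names from this article; rendererHR and rendererLR are assumed names for the two renderers):

//one image source and one FilterEffect, shared by both renderers
source = new StreamImageSource(input);
effects = new FilterEffect(source);
effects.Filters = new IFilter[] { solarizeFilter, temperatureAndTintFilter };

//one renderer per output resolution
rendererHR = new WriteableBitmapRenderer(effects, bitmapHR);
rendererLR = new WriteableBitmapRenderer(effects, bitmapLR);

//inside processRendering(): pick the renderer for the current mode
var renderer = HRrendering ? rendererHR : rendererLR;
var bitmap = HRrendering ? bitmapHR : bitmapLR;
await renderer.RenderAsync();
bitmap.Invalidate();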

Sample code implementing this solution is in the InteractiveNokiaImagingFilter2.cs file, which updates InteractiveNokiaImagingFilter.cs with the rendering time optimization.

As stated at the beginning of this section, the benefit of this optimization will depend on a number of factors:

  • With an 8 Mp picture, rendering time decreases from 120 ms to 90 ms.
  • With a 41 Mp picture, rendering time is constant at 90 ms.
  • With another example using gestures for picture navigation, a 41 Mp picture and half resolution, rendering time decreased from 500-160 ms to 180-60 ms.

Another correction

When you use only one WriteableBitmap as the Image control source, there is concurrent access between the Imaging SDK thread and the UI, so when the user moves the slider it is possible for the image to partially display the results from a number of slider positions. The snapshot below shows this phenomenon - the top and bottom parts of the displayed image (separated by a red line) are the result of different parameters.

Concurrent access phenomenon when the user moves a slider

Using the state machine removes parallel rendering of images, but the concurrent access between the SDK and the Image control can still result in the image showing the result of a number of renderings. To fix this you can use a temporary WriteableBitmap as the output target and, after rendering, copy it to the bitmap that the Image control displays:

//process rendering to the temporary WriteableBitmap
await renderer.RenderAsync();
//copy pixels
_tmpBitmap.Pixels.CopyTo(_previewBitmap.Pixels, 0);
_previewBitmap.Invalidate(); //force a redraw

Sample code

  • Naive method
  • Interactive State Machine

The example app allows you to interactively compare the two methods (including memory use) and the optimized rendering duration:

  1. Run the application in Release mode.
  2. Click the image icon in the application bar and select a picture.
  3. Select a pivot page:
    • the Naive page uses the naive method,
    • the Interactive page uses the Interactive State Machine,
    • the Interactive2 page uses the Interactive State Machine with optimized rendering duration.
  4. Move the slider.

The Interactive State Machine provides better interaction with the user.

Note: To increase the difference, you can test with a 41 MP picture taken with a Nokia Lumia 1020 or Nokia PureView 808.

Filter Effects SDK sample

The Nokia Imaging SDK example Filter Effects can be installed from the Windows Phone Store. Since Nokia Imaging SDK 1.0, this sample has integrated the Interactive State Machine. The documentation for this example provides a basic explanation of how to handle modifying filter properties on the fly.

Real example

The techniques described in this article have been used in my commercial "monsterification" (monster image editing) app: MonsterCam.

Monster Cam Tag

MonsterCam is one of the first applications based on the Imaging SDK, and offers:

  • Reframing with gestures.
  • Applying effects with user control.
  • Monsterification, done with DirectX.

This application provides an unlimited trial version (you don't need to pay) and can be found in the Windows Phone Store.

