Please note that as of October 24, 2014, the Nokia Developer Wiki will no longer be accepting user contributions, including new entries, edits and comments, as we begin transitioning to our new home, in the Windows Phone Development Wiki. We plan to move over the majority of the existing entries over the next few weeks. Thanks for all your past and future contributions.
How to build a multi-touch control for Windows Phone
This article is an introduction to developing multi-touch XAML controls for Windows Phone applications.
Windows Phone is already a multi-touch device, but for some reason (I guess it is a legacy of Silverlight) the standard XAML controls don’t have real multi-touch behaviour. Real multi-touch means that I should be able to interact with two different objects at the same time using multiple fingers. What happens with the standard XAML controls is that once a control starts receiving manipulation events (ManipulationStarted, ManipulationDelta, ManipulationCompleted), no other control will receive touch events. The easiest test is to add a standard button to a page, put one finger on the screen outside the button and try to press the button with another finger. You will see that the button does not respond to your commands. For XAML applications this might not be a problem, but for games it becomes one, and XAML (meaning XAML+C# or XAML+VB.NET) is a fast way to develop simple games.
The solution is to build your own control and use Touch.FrameReported to “drive” it. In this sample I will build a multi-touch button. I will call it ButtonEx (some of you may remember the OpenNETCF library?) and I will add just three events to it: TouchDown, TouchUpInside and TouchUpOutside (the iOS MonoTouch event names). With these three events I have finer control (Click is in reality a TouchUpInside event).
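As a sketch, the control’s event surface and press state could look like the class below. The plain EventHandler signatures and the helper methods are my assumptions; the article does not show the declarations themselves.

```csharp
using System;

// Sketch of the three events ButtonEx exposes, with plain EventHandler
// signatures (an assumption; the real payload type is not shown in the article).
public class ButtonExSketch
{
    public event EventHandler TouchDown;      // finger lands on the control
    public event EventHandler TouchUpInside;  // finger lifts inside the bounds ("Click")
    public event EventHandler TouchUpOutside; // finger lifts outside the bounds

    public bool IsPressed { get; private set; }

    // Helpers that the touch-tracking logic would call.
    public void RaiseTouchDown()
    {
        IsPressed = true;
        if (TouchDown != null) TouchDown(this, EventArgs.Empty);
    }

    public void RaiseTouchUp(bool inside)
    {
        IsPressed = false;
        if (inside)
        {
            if (TouchUpInside != null) TouchUpInside(this, EventArgs.Empty);
        }
        else
        {
            if (TouchUpOutside != null) TouchUpOutside(this, EventArgs.Empty);
        }
    }
}
```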
So I've created a new Windows Phone Class Library called ControlsEx and added a ButtonEx control derived from ContentControl. I copied the standard style of the Button control (you can easily generate it from a standard button in Blend, using the Edit a Copy command on the Button template) and added the style to the /Themes/generic.xaml file of the project. In the constructor I subscribe to the Loaded and Unloaded events, because I want to start receiving Touch events when the control loads and to unsubscribe from them when the control gets unloaded.
public ButtonEx()
{
    DefaultStyleKey = typeof(ButtonEx);
    this.Loaded += ButtonEx_Loaded;
    this.Unloaded += ButtonEx_Unloaded;
    IsEnabledChanged += ButtonEx_IsEnabledChanged;
    IsPressed = false;
}
void ButtonEx_Loaded(object sender, RoutedEventArgs e)
{
    Touch.FrameReported += Touch_FrameReported;
}

void ButtonEx_Unloaded(object sender, RoutedEventArgs e)
{
    Touch.FrameReported -= Touch_FrameReported;
}
Now everything we need “happens” inside the Touch_FrameReported method. For my button I am interested in tracking only one finger (using its id) from TouchAction.Down until TouchAction.Up. Once the first finger is down on the surface of my control I memorize its id and track its actions until it leaves the screen. Depending on the control you are building, you might have to take multiple fingers into consideration. One thing that is quite important when you start tracking a finger is to check whether your control is actually in front (imagine a MessageBox over your controls: when you press its Ok button you would also press the button behind it). To resolve this issue I’ve used the TouchDevice.DirectlyOver property of the TouchPoint and the VisualTreeHelper to check whether the UIElement returned by DirectlyOver is a member of my control or not.
bool IsControlChild(DependencyObject element)
{
    DependencyObject parent = element;
    while ((parent != this) && (parent != null))
        parent = VisualTreeHelper.GetParent(parent);
    return (parent == this);
}
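The walk itself does not depend on XAML. The same logic over any parent-linked tree looks like this; here Node and its Parent field stand in for DependencyObject and VisualTreeHelper.GetParent.

```csharp
using System;

// Illustration of the IsControlChild walk over a plain parent-linked tree.
// Node stands in for DependencyObject; the Parent field stands in for
// what VisualTreeHelper.GetParent returns.
public class Node
{
    public Node Parent;
}

public static class TreeWalk
{
    // Returns true when 'element' is 'root' itself or sits below it in the tree.
    public static bool IsChildOf(Node root, Node element)
    {
        Node parent = element;
        while ((parent != root) && (parent != null))
            parent = parent.Parent;
        return parent == root;
    }
}
```

Walking up from the touched element either reaches the control (the touch belongs to it) or runs off the top of the tree (the touch belongs to something drawn over it).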
Here is the complete Touch_FrameReported method:
void Touch_FrameReported(object sender, TouchFrameEventArgs e)
{
    if (Visibility == Visibility.Collapsed)
        return;

    TouchPointCollection pointCollection = e.GetTouchPoints(this);
    for (int i = 0; i < pointCollection.Count; i++)
    {
        if (idPointer == -1)
        {
            if (IsEnabled && (Visibility == Visibility.Visible) && (pointCollection[i].Action == TouchAction.Down) && IsControlChild(pointCollection[i].TouchDevice.DirectlyOver))
            {
                // start tracking this finger
                idPointer = pointCollection[i].TouchDevice.Id;
                IsPressed = true;
                if (TouchDown != null)
                    TouchDown(this, EventArgs.Empty);
            }
        }
        else if ((pointCollection[i].TouchDevice.Id == idPointer) && (pointCollection[i].Action == TouchAction.Up))
        {
            idPointer = -1;
            IsPressed = false;
            if ((pointCollection[i].Position.X > 0 && pointCollection[i].Position.X < ActualWidth) && (pointCollection[i].Position.Y > 0 && pointCollection[i].Position.Y < ActualHeight))
            {
                if (TouchUpInside != null)
                    TouchUpInside(this, EventArgs.Empty);
            }
            else if (TouchUpOutside != null)
                TouchUpOutside(this, EventArgs.Empty);
        }
    }
}
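The tracking logic can be exercised outside the framework. Below is a minimal, self-contained simulation of the single-finger state machine: FakeTouchPoint stands in for TouchPoint, the control is reduced to a rectangle, and events are recorded as strings. All names here are illustrative, not part of the real control.

```csharp
using System;
using System.Collections.Generic;

// Self-contained simulation of ButtonEx's single-finger tracking.
// FakeTouchPoint stands in for TouchPoint; hit-testing is a simple
// rectangle check instead of DirectlyOver + the visual tree.
public enum FakeTouchAction { Down, Move, Up }

public struct FakeTouchPoint
{
    public int Id;
    public FakeTouchAction Action;
    public double X, Y;
}

public class ButtonStateMachine
{
    readonly double width, height;
    int idPointer = -1;               // -1 means no finger is being tracked
    public bool IsPressed { get; private set; }
    public string LastEvent { get; private set; }

    public ButtonStateMachine(double width, double height)
    {
        this.width = width;
        this.height = height;
    }

    public void ProcessFrame(IList<FakeTouchPoint> points)
    {
        for (int i = 0; i < points.Count; i++)
        {
            if (idPointer == -1)
            {
                // Only the first finger that goes Down inside the control is tracked.
                if (points[i].Action == FakeTouchAction.Down && Inside(points[i]))
                {
                    idPointer = points[i].Id;
                    IsPressed = true;
                    LastEvent = "TouchDown";
                }
            }
            else if (points[i].Id == idPointer && points[i].Action == FakeTouchAction.Up)
            {
                idPointer = -1;
                IsPressed = false;
                LastEvent = Inside(points[i]) ? "TouchUpInside" : "TouchUpOutside";
            }
        }
    }

    bool Inside(FakeTouchPoint p)
    {
        return p.X > 0 && p.X < width && p.Y > 0 && p.Y < height;
    }
}
```

A finger that goes down inside the rectangle and lifts outside it produces TouchUpOutside rather than a click, which is exactly the distinction the three events give you over the standard Click.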
For the button control we don’t have to track the finger’s movements until the Up action, but we might need to if we were writing, for example, a Slider control. The sample application uses two ButtonEx controls and a standard Button control. The ButtonEx controls should always respond to your fingers.
If you would like to see a full multi-touch application developed using the described method have a look at BeeWii Windows Phone application.
The example code can be downloaded from here: File:ControlsEx.zip
- How to build a multi-touch control for Windows Phone (Original blog)