The Series 40 Touch and Type UI phones support a number of platform defined gestures.


Interface Summary
GestureEvent The GestureEvent interface is used by an application to receive gesture recognition events from the platform.
GestureListener This interface is implemented by applications that need to receive gesture events from the implementation.

Class Summary
GestureInteractiveZone The GestureInteractiveZone class is used by an application to define an area of the screen that reacts to a set of specified gestures.
GestureRegistrationManager The GestureRegistrationManager class provides the ability to register a GestureListener to be notified when a gesture event occurs within a container.

Package Description

The Series 40 Touch and Type UI phones support a number of platform defined gestures. These are Single tap, Long press, Long press repeated, Drag and drop, Flick and Pinch. In addition, virtual gestures for recognition start and recognition end are supported.

The Single tap is recognised by a quick touch down and release.
The Long press is a touch and hold.
The Long press repeated is generated repeatedly while a long press is held down.
Drag and drop is defined as a touch down, moving the finger whilst keeping contact with the touch screen, stopping, and then releasing.
The Flick gesture is defined as a touch down, a move, and a release before the finger movement stops.
The Pinch gesture is defined as a touch down for the first finger, a touch down for the second finger, and moving the finger(s) whilst keeping contact with the touch screen, stopping, and then releasing both fingers. It is also possible to release either finger and keep dragging the remaining finger.
Gesture recognition start is the initial state of the gesture recognition. It is recognised by touching the screen.
Gesture recognition end is the final state of the gesture recognition. It is recognised by releasing the finger.
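Each of the gestures above corresponds to a bit-flag constant on GestureInteractiveZone, so several gesture types can be combined with a bitwise OR when defining a zone. A sketch, assuming the constant names from the GestureInteractiveZone class reference:

```java
// Sketch: a zone that reacts only to taps, long presses and flicks.
// The GESTURE_* constant names are assumed from the
// GestureInteractiveZone class reference; check it for the exact set.
GestureInteractiveZone zone = new GestureInteractiveZone(
        GestureInteractiveZone.GESTURE_TAP
        | GestureInteractiveZone.GESTURE_LONG_PRESS
        | GestureInteractiveZone.GESTURE_FLICK );
```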

The Nokia Gesture API exposes the platform's gesture recognition engine to the MIDlet. This simplifies MIDlet development because the MIDlet does not need to implement its own gesture recognition engine. It also helps to ensure the MIDlet's user experience matches that of the native Series 40 Touch and Type UI platform.

The Gesture API uses the Observer design pattern. To use this API MIDlets must first create a GestureInteractiveZone. This defines a bounding rectangle for the Gesture event notifications. By default the bounding rectangle is the entire screen. Only Gesture events that are initiated within the confines of the zone are passed to the MIDlet. The GestureInteractiveZone also defines the types of Gesture events to register for.

// Defines a GestureInteractiveZone for the whole screen and all Gesture types.
GestureInteractiveZone giz = new GestureInteractiveZone( GestureInteractiveZone.GESTURE_ALL );
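A zone need not cover the whole screen. Assuming the setRectangle method described in the GestureInteractiveZone class reference, a zone can be limited to part of the display and to specific gesture types:

```java
// Sketch: a zone covering only a 100x60 pixel area at the top-left of
// the screen, reacting to taps and drags. setRectangle and the
// constants are assumed from the GestureInteractiveZone class reference.
GestureInteractiveZone buttonZone = new GestureInteractiveZone(
        GestureInteractiveZone.GESTURE_TAP
        | GestureInteractiveZone.GESTURE_DRAG );
buttonZone.setRectangle( 0, 0, 100, 60 ); // x, y, width, height
```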

This zone is then registered with the GestureRegistrationManager by passing in the container (either a Canvas or CustomItem) and the GestureInteractiveZone.

// Register the GestureInteractiveZone for my Canvas.
GestureRegistrationManager.register( canvas, giz );

The MIDlet must then define a class that implements the GestureListener interface. This interface defines a single method, gestureAction, which is called whenever the platform's gesture recognition engine detects a gesture in one of the registered GestureInteractiveZones. Each call receives a GestureEvent instance that holds the properties of the recognised gesture, such as its type (TAP, DRAG, and so on). For all event types the MIDlet can get the x and y location. For DRAG and DROP events the MIDlet can also get the change in x and y distance since the last drag event. For FLICK events the MIDlet can get the flick speed and direction. For PINCH events the MIDlet can get the starting and current distance between the fingers, the change in that distance since the last pinch event, the centre position between the fingers in the horizontal and vertical directions, and the change in that centre position since the last pinch event.
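A gestureAction implementation typically switches on the event type and reads the properties relevant to that gesture. In the sketch below the getter and constant names are assumed from the GestureEvent and GestureInteractiveZone class references, and the handleTap, handleDrag, handleFlick and handlePinch methods are hypothetical application code:

```java
// Sketch: dispatching on the gesture type. Getter names should be
// checked against the GestureEvent class reference.
public void gestureAction( Object container,
                           GestureInteractiveZone gestureZone,
                           GestureEvent gestureEvent )
{
    switch ( gestureEvent.getType() )
    {
        case GestureInteractiveZone.GESTURE_TAP:
            handleTap( gestureEvent.getStartX(), gestureEvent.getStartY() );
            break;
        case GestureInteractiveZone.GESTURE_DRAG:
            // Change in position since the last drag event.
            handleDrag( gestureEvent.getDragDistanceX(),
                        gestureEvent.getDragDistanceY() );
            break;
        case GestureInteractiveZone.GESTURE_FLICK:
            handleFlick( gestureEvent.getFlickDirection(),
                         gestureEvent.getFlickSpeed() );
            break;
        case GestureInteractiveZone.GESTURE_PINCH:
            // Current versus starting distance between the fingers.
            handlePinch( gestureEvent.getPinchDistanceCurrent(),
                         gestureEvent.getPinchDistanceStarting() );
            break;
    }
}
```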

Gesture recognition start and gesture recognition end are virtual gestures used to track the individual finger states (press and release) that drive gesture recognition. A MIDlet can, for instance, implement a continuous zoom feature by registering a GestureInteractiveZone for the Pinch and gesture recognition end gestures.
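The continuous zoom idea can be sketched as follows. The GESTURE_PINCH and GESTURE_RECOGNITION_END constants and the pinch getters are assumed from the class references; baseZoom, currentZoom and repaintWithZoom are hypothetical application state and code:

```java
// Sketch: continuous zoom driven by Pinch plus recognition end.
if ( event.getType() == GestureInteractiveZone.GESTURE_PINCH )
{
    // Scale relative to the finger distance at the start of the pinch.
    currentZoom = baseZoom * (double) event.getPinchDistanceCurrent()
                           / event.getPinchDistanceStarting();
    repaintWithZoom( currentZoom );
}
else if ( event.getType() == GestureInteractiveZone.GESTURE_RECOGNITION_END )
{
    baseZoom = currentZoom; // fingers released: keep the reached zoom level
}
```

Using recognition end to commit the zoom level means the next pinch scales from where the previous one left off, rather than restarting from the original zoom.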

public void gestureAction(Object container,
                          GestureInteractiveZone gestureZone,
                          GestureEvent gestureEvent)
{
    // TODO: add custom code here.
}


Copyright © 2012 Nokia Corporation. All rights reserved.

Nokia is a registered trademark of Nokia Corporation. Java and all Java-based marks are trademarks or registered trademarks of Oracle Corporation. Other product and company names mentioned herein may be trademarks or trade names of their respective owners. This document is confidential information of Nokia Corporation.

The information in this document is provided "as is," with no warranties whatsoever, including any warranty of merchantability, fitness for any particular purpose, or any warranty otherwise arising out of any proposal, specification, or sample. Furthermore, information provided in this document is preliminary, and may be changed substantially prior to final release.

Nokia Corporation disclaims all liability, including liability for infringement of any proprietary rights, relating to this document and implementation of any information presented in this document.

Nokia Corporation retains the right to make changes to this document at any time, without notice.

Subject to above disclaimer, a license is hereby granted to use this documentation solely under existing Limited License Agreement and non-disclosure agreement between the companies for the agreed application development for Series 40 Nokia phones. No other licenses e.g. to any intellectual property rights are granted herein. Any use of the screen shots of this documentation, including any icons thereof, is subject to Nokia's prior written approval.