
Sensor based interactions with home screen Web Runtime widgets

Article Metadata
Created: jappit (25 Jun 2009)
Last edited: hamishwillee (13 Feb 2012)

This article proposes some sensor-based interaction patterns to be used with home screen Web Runtime widgets.



Starting with the Nokia N97, Web Runtime offers the possibility to add widgets to the device home screen. Home screen widgets let users view data from multiple widgets at a glance, without opening each one in full screen mode.

User interaction

Current home screen widgets do not support direct user interaction, as explained in this Nokia Developer Library page. When the user taps a home screen widget, its full screen version is opened.

Sensor based interaction

This section shows two possible ways of interacting with home screen widgets that use the device's built-in accelerometer and the JavaScript Sensor Service API, available starting from WRT 1.1.

Sensor based interaction patterns

This section discusses some possible interaction patterns that allow users to interact with home screen widgets without touch-based input. To implement the following patterns, the JavaScript Sensor Service API must be used. Complete code examples on how to use sensors in WRT widgets can be found on the following pages:

Shake pattern

The built-in accelerometer detects device motion by measuring acceleration along the three spatial axes. From this data it is possible to recognize when the device is shaken, by checking for fast, repeated movements. The sensor channel used to retrieve the acceleration values is AccelerometerAxis.

Wrt shakepattern.png
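Retrieving acceleration data for this pattern can be sketched roughly as follows, using the Platform Services Sensor API available from WRT 1.1. The exact property names (SearchCriterion, ChannelInfoMap, AxisX and so on) follow my reading of the Platform Services documentation and should be checked against the API reference; the "device" object only exists inside a running WRT widget, so the call is guarded.

```javascript
// Sketch: registering for accelerometer data via the Platform Services
// Sensor API (WRT 1.1+). Property names are assumptions to verify
// against the official API reference.
var sensorService = null;

function startAccelerometer(onData) {
    if (typeof device === "undefined") { return false; } // not running in WRT
    sensorService = device.getServiceObject("Service.Sensor", "ISensor");
    // Look up the AccelerometerAxis channel
    var search = sensorService.ISensor.FindSensorChannel(
        { SearchCriterion: "AccelerometerAxis" });
    if (search.ErrorCode !== 0) { return false; }
    var channel = search.ReturnValue[0];
    // Ask for continuous data notifications on that channel
    sensorService.ISensor.RegisterForNotification(
        { ChannelInfoMap: channel, ListeningType: "ChannelData" },
        function (transId, eventCode, result) {
            // result.ReturnValue carries the AxisX / AxisY / AxisZ readings
            onData(result.ReturnValue.AxisX,
                   result.ReturnValue.AxisY,
                   result.ReturnValue.AxisZ);
        });
    return true;
}
```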

The shake pattern could be used for several purposes, depending on the specific home screen widget:

  • if a widget shows data retrieved from a remote host, a shake can be used to force a refresh of the data presented to the user. This avoids automatic, unnecessary refreshes, minimizing network traffic.
  • if a widget shows multiple items (e.g. photos, news), a shake can be used to move to the next available item.
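The detection itself can be kept independent of the sensor plumbing. A minimal sketch, treating a shake as several large swings in acceleration within a short time window (the threshold and window values below are illustrative guesses that would need tuning on a real device):

```javascript
// Sketch of shake detection over a stream of accelerometer samples.
// threshold: minimum change in acceleration counted as one swing;
// swingsNeeded: swings required within windowMs to report a shake.
function createShakeDetector(threshold, swingsNeeded, windowMs) {
    var lastX = 0;
    var swings = [];
    return function (x, timestampMs) {
        var delta = x - lastX;
        lastX = x;
        if (Math.abs(delta) >= threshold) {
            swings.push(timestampMs);
            // keep only the swings that fall inside the time window
            swings = swings.filter(function (t) {
                return timestampMs - t <= windowMs;
            });
            if (swings.length >= swingsNeeded) {
                swings = [];
                return true; // shake detected
            }
        }
        return false;
    };
}
```

The returned function would be fed one axis of the accelerometer callback (here the X axis) together with a timestamp, and returns true once per detected shake.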

Flip pattern

Sensors can also be used to detect the phone's current orientation, through the Orientation sensor channel. Depending on the device orientation, a WRT widget can then behave differently.

Wrt flippattern.png
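Subscribing to the Orientation channel follows the same shape as the accelerometer case. A sketch, again with the property names (SearchCriterion, DeviceOrientation and the orientation strings) taken as assumptions to verify against the Platform Services API reference:

```javascript
// Sketch: reading the device orientation through the Orientation
// channel of the Sensor API. The guard keeps the snippet loadable
// outside a WRT widget, where "device" does not exist.
function startOrientation(onOrientation) {
    if (typeof device === "undefined") { return false; } // not in WRT
    var so = device.getServiceObject("Service.Sensor", "ISensor");
    var search = so.ISensor.FindSensorChannel(
        { SearchCriterion: "Orientation" });
    if (search.ErrorCode !== 0) { return false; }
    so.ISensor.RegisterForNotification(
        { ChannelInfoMap: search.ReturnValue[0], ListeningType: "ChannelData" },
        function (transId, eventCode, result) {
            // Assumed values include strings such as "DisplayUpwards"
            // and "DisplayDownwards"
            onOrientation(result.ReturnValue.DeviceOrientation);
        });
    return true;
}
```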

A possible use case for this scenario is a widget that periodically retrieves data from a network host, generating network traffic in the process. Here it could be useful to stop the automatic data refresh simply by flipping the device display downwards.

More generally, this pattern is useful in any situation where the user should be able to "stop" the widget's activity simply by turning the device face down (e.g. at night).
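The pause/resume logic for this pattern can be sketched as a small state machine fed with orientation values. The "DisplayDownwards" string used for face down is an assumption about the Orientation channel's values; the pause and resume callbacks would typically stop and restart the widget's refresh timer:

```javascript
// Sketch: pause a widget's periodic refresh while the device lies
// face down, resume it in any other orientation. "DisplayDownwards"
// is the assumed face-down value of the Orientation channel.
function createFlipController(onPause, onResume) {
    var paused = false;
    return function (orientation) {
        if (orientation === "DisplayDownwards" && !paused) {
            paused = true;
            onPause();   // e.g. clearInterval(refreshTimer)
        } else if (orientation !== "DisplayDownwards" && paused) {
            paused = false;
            onResume();  // e.g. restart the refresh timer
        }
        return paused;
    };
}
```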


Constraints to be considered when implementing sensor-based interaction patterns:

  • battery consumption: continuously monitoring the accelerometer affects battery life. Sensor monitoring should therefore be carefully tested and tuned to avoid excessive battery drain.
  • unintentional device motions: home screen widgets are designed to be always active, so sensor monitoring must be able to recognize unintentional motions and avoid triggering unwanted widget responses.
  • multiple home screen widgets: since the home screen can host multiple widgets at the same time, users may want sensor-based interaction enabled only for some of them. An option to enable or disable sensor-based interaction should therefore be provided for each widget.
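Two of these constraints can be addressed with a small wrapper around the sensor callback: a user-controlled enable flag, and a stability filter that only forwards a reading once it has repeated for several consecutive samples, discarding brief unintentional motions. A sketch, with the sample count purely illustrative:

```javascript
// Sketch: filter sensor readings so that (a) the user can disable
// sensor-based interaction entirely and (b) a value is only reported
// after it has been stable for samplesNeeded consecutive samples.
function createStableFilter(samplesNeeded, onStable) {
    var candidate = null, count = 0, current = null, enabled = true;
    var filter = function (value) {
        if (!enabled) { return; }
        if (value === candidate) {
            count += 1;
        } else {
            candidate = value;
            count = 1;
        }
        if (count >= samplesNeeded && candidate !== current) {
            current = candidate;
            onStable(current); // report the newly stable value once
        }
    };
    // user preference toggle for sensor-based interaction
    filter.setEnabled = function (flag) { enabled = flag; };
    return filter;
}
```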