Sensor-based interactions with home screen Web Runtime widgets
This article proposes some sensor-based interaction patterns for use with home screen Web Runtime widgets.
Starting with the Nokia N97, Web Runtime offers the possibility to add widgets to the device home screen. Home screen widgets allow users to view data from multiple widgets without opening them in full screen mode.
Current home screen widgets do not allow direct user interaction, as explained in this Forum Nokia Library page. When the user taps a home screen widget, its full screen version is opened.
Sensor-based interaction
Sensor-based interaction patterns
Built-in accelerometers make it possible to detect device motion by measuring acceleration along the three spatial axes. By checking this data for fast, repetitive movements, it is possible to detect when the device is shaken. The sensor channel used to retrieve the acceleration values is AccelerometerAxis (more information here: http://www.developer.nokia.com/Resources/Library/Web/)
The shake pattern could serve several purposes, depending on the specific home screen widget:
- if a widget shows data retrieved from a remote host, a shake gesture could force a refresh of the data presented to the user. This avoids automatic, unnecessary refreshes, minimizing network traffic.
- if a widget shows multiple items (e.g. photos, news), a shake can be used to move to the next available item.
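As a sketch, the shake-detection logic can be kept separate from the sensor plumbing. The function below consumes raw (x, y, z) acceleration samples (in a WRT widget these would come from the AccelerometerAxis channel's data callback) and fires when several large sample-to-sample changes occur within a short window. All threshold and timing values are illustrative, not tuned figures from the platform documentation.

```javascript
// Minimal shake detector: feed it accelerometer samples and it calls
// onShake when the change between consecutive samples exceeds a
// threshold several times within a sliding time window.
// The default threshold, peak count and window are assumptions to tune.
function createShakeDetector(onShake, options) {
    var opts = options || {};
    var threshold = opts.threshold || 25;        // minimum summed per-axis delta
    var requiredPeaks = opts.requiredPeaks || 3; // peaks needed to count as a shake
    var windowMs = opts.windowMs || 1000;        // sliding window length
    var last = null;
    var peaks = [];

    return function onSample(x, y, z, timestamp) {
        if (last !== null) {
            var delta = Math.abs(x - last.x) +
                        Math.abs(y - last.y) +
                        Math.abs(z - last.z);
            if (delta > threshold) {
                peaks.push(timestamp);
                // Drop peaks that fell out of the sliding window
                while (peaks.length > 0 && timestamp - peaks[0] > windowMs) {
                    peaks.shift();
                }
                if (peaks.length >= requiredPeaks) {
                    peaks = [];
                    onShake();
                }
            }
        }
        last = { x: x, y: y, z: z };
    };
}
```

Keeping the detector free of any sensor API makes it easy to test on the desktop and to reuse unchanged if the underlying channel changes.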
Sensors can also be used to detect the current orientation of the phone, through the Orientation sensor channel. Depending on the device orientation, WRT widgets can then be made to behave differently.
A possible use case for this scenario is a widget that periodically retrieves data from a network host, thus generating network traffic. Here it could be useful to stop the automatic data refresh simply by flipping the device display downwards.
More generally, this pattern could be useful in all situations where the user should be able to "stop" the widget's activity by simply placing the device face down (e.g. overnight).
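The face-down pattern can be sketched as a small controller that suspends and resumes the widget's periodic refresh. The orientation string "DisplayDownwards" matches the values I understand the Orientation channel to report; treat it as an assumption to verify against the sensor documentation for the target device.

```javascript
// Sketch: stop a widget's periodic network refresh while the device lies
// face down, and resume it for any other orientation. startRefresh and
// stopRefresh are callbacks supplied by the widget (e.g. wrapping
// setInterval / clearInterval around its data-fetch routine).
function createRefreshController(startRefresh, stopRefresh) {
    var paused = false;
    return function onOrientation(orientation) {
        if (orientation === "DisplayDownwards" && !paused) {
            paused = true;
            stopRefresh();   // suspend polling while face down
        } else if (orientation !== "DisplayDownwards" && paused) {
            paused = false;
            startRefresh();  // resume once the device is picked up
        }
    };
}
```

The internal flag makes the controller idempotent: repeated notifications for the same orientation do not restart or re-stop the refresh.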
Constraints to consider when implementing sensor-based interaction patterns:
- battery consumption: continuously monitoring the accelerometer drains the battery. Careful testing and tuning should therefore be performed to optimize sensor monitoring without excessive power consumption.
- unintentional device motion: home screen widgets are designed to be always active. Sensor monitoring should therefore filter out unintentional motions, so that the widget does not respond to them by mistake.
- multiple home screen widgets: since the home screen can host several widgets at once, users may want to disable sensor-based interaction for some of them and interact only with a chosen subset. An option to enable or disable sensor-based interaction should therefore be offered to the user.
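The per-widget enable/disable option can be sketched as a small gate around the sensor callbacks. In a WRT widget the flag could be persisted with the widget object's preference methods (widget.setPreferenceForKey / widget.preferenceForKey); here the storage is abstracted behind a plain object so the logic stays testable, and the preference key name is purely illustrative.

```javascript
// Sketch: a toggle that decides whether sensor events reach the widget.
// "storage" is any string-keyed store; in a real widget it would delegate
// to the WRT preference methods.
function createSensorGate(storage) {
    var KEY = "sensorInteractionEnabled";   // illustrative preference key
    return {
        isEnabled: function () {
            // Default to enabled when no preference has been stored yet
            return storage[KEY] !== "false";
        },
        setEnabled: function (enabled) {
            storage[KEY] = enabled ? "true" : "false";
        },
        // Wrap a sensor callback so it only runs while the gate is enabled
        guard: function (callback) {
            var self = this;
            return function () {
                if (self.isEnabled()) {
                    callback.apply(null, arguments);
                }
            };
        }
    };
}
```

Wrapping callbacks with guard() lets the same shake or orientation handlers be registered once, while the user's setting decides at run time whether they take effect.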