
Archived:Python on Symbian/11. Sensor Framework



All PySymbian articles have been archived. PySymbian is no longer maintained by Nokia and is not guaranteed to work on more recent Symbian devices. It is not possible to submit apps to Nokia Store.

Article Metadata
Code Example Article
Created: hamishwillee (30 Nov 2010)
Last edited: hamishwillee (08 May 2013)

Original Author: Mike Jipping

The Symbian sensor framework provides access to sensor information on Symbian devices (most Nokia S60 3rd Edition, FP2 devices or later products). This chapter explains how to check for sensor support and monitor sensor data using the framework. It also gives some real-world examples of how you can use sensors to control your applications, and provides a very brief overview of the legacy sensor API used in earlier platforms.



Mobile devices may contain many sophisticated sensors including, but not limited to, accelerometers, light sensors, proximity sensors, magnetometers, and magnetic north sensors. These sensors may be used for any number of purposes, such as turning off touch screen sensitivity when a user's face is close to the screen, dimming the display in a dark room to save power, or enabling gesture and motion control in an application. Applications are already using sensors in ways that were not imagined when they were first included in mobile devices.

The Symbian platform provides access to all sensors through a "sensor framework". The framework provides a common generic mechanism for applications to query the platform for sensor availability and to obtain sensor data. It can be extended using plug-ins as new sensors are added, ensuring that a consistent interface is available to applications moving forward.

This chapter explains how to use the Symbian sensor framework through Python. At the end of the chapter there is a brief discussion of the legacy sensor-specific APIs used in older versions of the Symbian/S60 platform.

Sensor channels

Python provides access to the Symbian platform's sensor framework through the sensor module.

The sensor framework (and module) makes sensor data available through "channels", where each channel represents a single type of information from a sensor. Some sensors will provide multiple channels; for example, an accelerometer can supply data to both a three-dimensional positioning channel and to a channel that detects double-taps on the screen.

In PySymbian v2.0.0, the following channels are currently supported:

  • Accelerometer XYZ sensor
  • Rotation sensor
  • Orientation sensor
  • Accelerometer double-tap sensor
  • Proximity monitor sensor
  • Ambient light sensor
  • Magnetic North sensor
  • Magnetometer XYZ sensor.

Programmers can use the list_channels() function to list all the channels available on a device. The function returns a list of dictionary objects, which contain channel information including:

  • id: system id of the channel
  • type: channel type, useful in access function calls
  • name: system name of the channel.

For example, a Python shell session on the Nokia N97 lists 10 channels:

>>> import sensor
>>> sensor.list_channels()
[{'type': 536929669L, 'id': 7L, 'name': 'ProximityMonitor'},
{'type': 536919830L, 'id': 8L, 'name': 'AmbientLightData'},
{'type': 270553214L, 'id': 9L, 'name': 'AccelerometerXYZAxisData'},
{'type': 270553217L, 'id': 10L, 'name': 'AccelerometerDoubleTappingData'},
{'type': 270553215L, 'id': 11L, 'name': 'TSensrvTappingData'},
{'type': 536919776L, 'id': 12L, 'name': 'MagnetometerXYZAxisData'},
{'type': 536957243L, 'id': 13L, 'name': None},
{'type': 536919775L, 'id': 14L, 'name': 'MagneticNorthData'},
{'type': 270553224L, 'id': 15L, 'name': 'OrientationData'},
{'type': 270553225L, 'id': 16L, 'name': 'RotationData'}]

The name of each channel is human-readable; for example, MagnetometerXYZAxisData is a magnetometer that provides three-dimensional spatial information. As you'll see in the rest of the chapter, we use the name in our scripts rather than the type or id because this results in more readable code. Note too that not every channel is useful: the one whose name is None doesn't do anything, and the AccelerometerDoubleTappingData and TSensrvTappingData channels provide the same information.

Checking sensor availability

The sensor framework is not present on every Symbian device that supports Python, and not every device where it is present has the same set of sensors/channels. Well-written applications will either restrict their installation to devices and platforms that support the full set of functionality, or selectively enable functionality based on the presence of specific sensors. Applications can check for the presence of the sensor module as follows:

try:
    import sensor
except ImportError:
    sensor = None   # sensor framework not available on this device

The easiest way to determine if a particular sensor is available is to iterate through the available channels (use list_channels()) and compare the channel name to the name of the required channel. The following code shows how this may be done:

def sensorPresent(testSensor):
    sensors = sensor.list_channels()
    for sense in sensors:
        if sense['name'] == testSensor:
            return True
    return False
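The same lookup can be exercised off-device against a stubbed channel list. The helper below restates sensorPresent() with the channel list passed in as a parameter; the stub dictionaries mimic the shape returned by list_channels(), and their values are illustrative:

```python
def sensor_present(channels, name):
    # Same test as sensorPresent(), but taking the channel list as a
    # parameter so it can be exercised without the sensor module.
    for channel in channels:
        if channel['name'] == name:
            return True
    return False

# Stubbed channel list mimicking the shape returned by sensor.list_channels()
stub_channels = [
    {'type': 536929669, 'id': 7, 'name': 'ProximityMonitor'},
    {'type': 270553214, 'id': 9, 'name': 'AccelerometerXYZAxisData'},
]

print(sensor_present(stub_channels, 'AccelerometerXYZAxisData'))  # True
print(sensor_present(stub_channels, 'RotationData'))              # False
```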

Channel classes and attributes

Each sensor channel is represented by a separate class. The classes have different attributes that are used to provide data from the sensor. For example, the accelerometer XYZ data channel class, called AccelerometerXYZAxisData, has x, y, and z attributes that give values along the X-, Y-, and Z-axes, respectively.

Table 2.1 lists each sensor channel, the name of the class that represents it, and the class attributes that are used to reference values of the sensor channel data.

Table 2.1: Sensor channels

  • Accelerometer XYZ sensor channel (class AccelerometerXYZAxisData):
    • x gives the X-axis value
    • y gives the Y-axis value
    • z gives the Z-axis value
  • Accelerometer double-tap sensor channel (class AccelerometerDoubleTappingData):
    • direction gives the tap direction
  • Magnetometer XYZ sensor channel (class MagnetometerXYZAxisData):
    • x gives the X-axis value
    • y gives the Y-axis value
    • z gives the Z-axis value
    • calib_level is an integer giving the level of calibration: 0 means the device is not calibrated, 1 indicates low calibration, 2 indicates a medium level of calibration, and 3 indicates highly accurate calibration
  • Magnetic North sensor channel (class MagneticNorthData):
    • azimuth gives the degrees clockwise from magnetic north; values 0 to 359 are possible
  • Ambient light sensor channel (class AmbientLightData):
    • ambient_light gives the light level as an integer percentage from 0 to 100, where 0 means very dark and 100 means sunny
  • Proximity monitor sensor channel (class ProximityMonitor):
    • proximity_state gives one of three values: 0 means the proximity is not determined, 1 means another object is not close, and 2 means another object is close
  • Orientation sensor channel (class OrientationData):
    • device_orientation gives the orientation of the device as an integer from -1 to 6. The orientation is from the user's perspective, with the phone held in "portrait" mode (Figure 11.1 shows the labelling of sides on a phone): -1 means the sensor has not been initialized, 0 means the orientation is undefined, 1 means the "display up" side is up, 2 means the "display down" side is up, 3 means the left side is up, 4 means the right side is up, 5 means the display itself is up, and 6 means the back of the device is up
  • Rotation sensor channel (class RotationData):
    • x gives the X-axis value
    • y gives the Y-axis value
    • z gives the Z-axis value
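When logging sensor values, the integer codes in Table 2.1 are easier to read as strings. The mapping below is written from the table; the helper name and label strings are our own (on a real device, sensor.get_logicalname() returns the framework's own names):

```python
# Descriptive labels for the integer codes in Table 2.1.
# The label strings here are our own; the sensor API's get_logicalname()
# returns its own (different) names on a real device.
ORIENTATION_NAMES = {
    -1: "uninitialized",
    0: "undefined",
    1: "display-up side up",
    2: "display-down side up",
    3: "left side up",
    4: "right side up",
    5: "display up",
    6: "back up",
}

PROXIMITY_NAMES = {0: "indeterminate", 1: "not close", 2: "close"}

def describe_orientation(code):
    # Fall back to a labelled raw value for codes outside the table.
    return ORIENTATION_NAMES.get(code, "unknown (%d)" % code)

print(describe_orientation(3))   # left side up
print(PROXIMITY_NAMES[2])        # close
```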

Most of the information in the table is self-explanatory, although it's worth pointing out the following:

  • Several of the sensors give a three-dimensional result as (x, y, z) axis data. It is important to remember that each three-dimensional sensor is different: the "accelerometer XYZ sensor channel" gives data on the movement along each axis; the "magnetometer XYZ sensor channel" gives data about the geomagnetic field along each axis; and the "rotation sensor channel" gives the degree of rotation about each axis.
  • Some sensors return integer data where a descriptive string might be more useful. For example, an ambient light level of 40 may not be as meaningful as "AmbientLightTwilight". The sensor.get_logicalname() function allows you to get the string for a particular value, as follows:
# Format is:  get_logicalname(<classLookupName>, <value>)
# For example
sensor.get_logicalname(sensor.SensrvAmbientLightData, 40)
  • There is more data available for the accelerometer double-tap sensor channel than just the "direction" value. The data is available from other functions in the AccelerometerDoubleTappingData class:
    • get_axis_active() returns a tuple of three axis activity indicators: a 0 (disabled) or a 1 (active) for each axis
    • set_axis_active() sets one or more axes as active
    • get_properties() returns a tuple of values indicating the TapThresholdValue, TapDurationValue, TapLatencyValue, and TapIntervalValue variables
    • set_properties() sets the values of those variables.
The "active axis" determines how the double tap is determined. For example, if the X axis is turned off, but the Y and Z axes are turned on, then only changes in the Y and Z values will be used to determine double tapping.
These calls will be used and more deeply explained later in the section on gestures.
  • The AccelerometerXYZAxisData and RotationData class constructors take an optional parameter that gives the name of a function used to implement a noise filtering algorithm. The possible choices are MedianFilter() or LowPassFilter().
    • A median filter will select the middle value from a collection of data, and for data that is relatively close together it is a good general-purpose noise reducer.
    • A low pass filter will focus on values at the lower end of the data range and ignore high values. For noisy data that can produce wilder, out-of-range values, a low-pass filter smoothes out the signal and removes short spikes in the retrieved data.
    • Not specifying a filter in the constructor call will allow raw data to be retrieved from the sensor.
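The behaviour of the two filters is easy to see off-device with plain Python. The functions below are our own simplified restatements, not the sensor module's implementations, and the smoothing factor is an arbitrary illustration:

```python
def median_filter(samples, window=3):
    # Replace each sample with the median of a small surrounding window;
    # an isolated spike disappears because it never reaches the middle rank.
    out = []
    for i in range(len(samples)):
        lo = max(0, i - window // 2)
        chunk = sorted(samples[lo:lo + window])
        out.append(chunk[len(chunk) // 2])
    return out

def low_pass_filter(samples, alpha=0.3):
    # Exponential smoothing: each output leans mostly on the previous
    # output, so short spikes are damped rather than followed.
    out = [samples[0]]
    for s in samples[1:]:
        out.append(out[-1] + alpha * (s - out[-1]))
    return out

noisy = [0, 1, 0, 55, 1, 0, 1]    # one large spike
print(median_filter(noisy))        # spike removed entirely
print(low_pass_filter(noisy))      # spike damped, not removed
```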

Monitoring sensor data

Sensor channels cannot be polled: you can't simply create a sensor channel object and query the current value of the data. Instead, the sensor framework is designed to be monitored and the data returned to the application in callbacks as it arrives.

The sequence of setting up and retrieving data is as follows:

  1. Initialize the sensor by creating an instance of the channel class
  2. Register a callback function by calling the channel object's set_callback() function
  3. Start monitoring the channel by calling the channel object's start_listening() function
  4. Collect data for some time period
  5. Stop retrieving data (i.e., stop using the callback function) by calling the stop_listening() function of the sensor object.
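The five steps map directly onto the channel API and can be rehearsed without a device. The stub class below mimics the channel lifecycle with canned data; FakeChannel and its sample values are ours, not part of the sensor module (on a device you would construct, say, sensor.AmbientLightData() instead):

```python
class FakeChannel(object):
    # Stand-in for a sensor channel: same set_callback()/start_listening()/
    # stop_listening() lifecycle, but fed from a canned list of samples.
    def __init__(self, samples):
        self._samples = samples
        self._callback = None
        self.ambient_light = None

    def set_callback(self, data_callback):
        self._callback = data_callback        # step 2: register the callback

    def start_listening(self):
        for value in self._samples:           # steps 3-4: deliver data
            self.ambient_light = value
            self._callback()

    def stop_listening(self):
        self._callback = None                 # step 5: stop monitoring

readings = []
light = FakeChannel([40, 60, 80])             # step 1: create the channel
def report_the_light():
    readings.append(light.ambient_light)
light.set_callback(data_callback=report_the_light)
light.start_listening()
light.stop_listening()
print(readings)                               # [40, 60, 80]
```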

The following code shows how to use the ambient-light sensor channel class to monitor the light level and print its logical name:

light = sensor.AmbientLightData()
def reportTheLight():
    global light
    level = light.ambient_light
    print "Light level is", level, "or", sensor.get_logicalname(sensor.SensrvAmbientLightData, level)
light.set_callback(data_callback=reportTheLight)
light.start_listening()

The output of the code is something like:

Light level is 60 or AmbientLightLight

A call to light.stop_listening() will stop the monitoring.

The following example shows how to monitor the orientation of the device (omitting the call to "side.stop_listening()" to stop monitoring):

side = sensor.OrientationData()
def reportOrientation():
    global side
    print sensor.get_logicalname(sensor.SensrvDeviceOrientation, side.device_orientation)
side.set_callback(data_callback=reportOrientation)
side.start_listening()

Now, if I start with the phone facing me in portrait orientation and then rotate it counter-clockwise, the callback prints the logical name of each new orientation as the device turns.


For the final example, we monitor the accelerometer coordinates as the phone is rotated (counter-clockwise):

meter = sensor.AccelerometerXYZAxisData(data_filter=sensor.LowPassFilter())
def reportXYZ():
    global meter
    print "(", meter.x, ",", meter.y, ",", meter.z, ")"
meter.set_callback(data_callback=reportXYZ)
meter.start_listening()

The result is a stream of (x, y, z) coordinate data:

( -1, 6, 0 )
( -1, 11, 1 )
( -1, 16, 2 )
( -2, 23, 3 )
( -3, 29, 4 )
( -4, 34, 6 )
( -5, 40, 9 )
( -5, 46, 11 )
( -6, 51, 12 )
( -6, 57, 13 )
( -7, 57, 14 )
( -8, 58, 14 )
( -10, 59, 13 )
( -11, 59, 11 )
( -10, 59, 10 )
( -9, 58, 10 )
( -7, 57, 9 )
( -7, 57, 9 )
( -6, 57, 9 )
( -6, 56, 7 )
( -5, 56, 7 )
( -5, 57, 8 )
( -3, 55, 9 )
( 0, 54, 10 )
( 1, 54, 12 )
( 1, 56, 11 )
( 1, 57, 11 )
( 1, 58, 10 )
( 1, 58, 10 )
( 2, 58, 11 )
( 3, 58, 11 )

Obviously, my hand did not remain perfectly still along the Z axis, but let's ignore that for now. You can see that the movement of the phone started slowly along the X axis but was faster along the Y axis. If we plot all 211 points on an X-Y graph, we get the graph shown in Figure 11.2:

Figure 11.2 Plot of phone coordinates moving in space

You can see the phone roughly moved in a circle, except when I had to change hands, which slowed movement along one axis.

Sensor controlled bouncing ball

This section gives a more complex example. The example distributed with the PySymbian installation implements a ball dropping and bouncing on the screen in response to arrow keys. We modify this to control the bouncing ball using sensors.

The first question to ask is, "Which sensors should we use?" We want the ball to respond to movement of the phone, so we could use the rotation sensor or the accelerometer (other sensors, like the ambient light sensor, do not make sense for this use case). The rotation sensor reports rotation in degrees around each axis; it does not provide data that corresponds to movement along an axis or that measures distances. So, the accelerometer appears to be the sensor of choice.

The following code provides the full solution. We examine this code in chunks afterwards.

import appuifw, e32
from sensor import *
from graphics import *

# Some initial initializations
X = 0
Y = 1
img = None

# Here, we define some callbacks.
# First, we define the callback for the accelerometer. We record the direction
# of the movement along the X and Y axes.
def deviceMoved():
    global accelerometer, prevX, prevY, dirx, diry
    dirx = prevX - accelerometer.x
    diry = prevY - accelerometer.y
    prevX = accelerometer.x
    prevY = accelerometer.y

# This is a callback for redrawing canvas graphics
def handleRedraw(rect):
    if img:
        canvas.blit(img)

# We ignore canvas events
def handleCanvasEvent(event):
    pass

# Application QUIT callback
def quit():
    global running
    running = 0

# Now we set up the accelerometer object and start listening to the accelerometer
accelerometer = AccelerometerXYZAxisData(data_filter=LowPassFilter())
accelerometer.set_callback(data_callback=deviceMoved)
accelerometer.start_listening()

# Here we set up the application screen to be the canvas, at full screen
appuifw.app.screen = 'full'
canvas = appuifw.Canvas(event_callback=handleCanvasEvent, redraw_callback=handleRedraw)
appuifw.app.body = canvas
appuifw.app.exit_key_handler = quit

# New (empty) image
img = Image.new(canvas.size)

# Some final initializations before we begin. Ball is in the middle of the screen.
prevX = prevY = 0
location = [img.size[X]/2, img.size[Y]/2]
speed = [0., 0.]
blobsize = 16
width, height = img.size[X]-blobsize, img.size[Y]-blobsize
gravity = 0.03
acceleration = 0.1
frames = 0
dirx = 0
diry = 0

# Start things going. Stop the simulation when "running" = 0
running = 1
while running:
    # Clear the screen, draw the ball.
    img.clear(0)
    img.point((location[X]+blobsize/2, location[Y]+blobsize/2),
              0x00ff00, width=blobsize)
    handleRedraw(None)
    # Yield active object (thread) execution for a moment
    e32.ao_yield()
    # Adjust the speed to slow a bit and to obey gravity. Then adjust the location.
    speed[X] *= 0.999
    speed[Y] *= 0.999
    speed[X] += gravity if dirx > 0 else -gravity
    speed[Y] += gravity if diry > 0 else -gravity
    location[X] += speed[X]
    location[Y] += speed[Y]
    # If we hit a wall, bounce back! (The 0.80 damping factor is illustrative.)
    if location[X] > width:
        location[X] = width
        speed[X] = -0.80 * speed[X]
    if location[X] < 0:
        location[X] = 0
        speed[X] = -0.80 * speed[X]
    if location[Y] > height:
        location[Y] = height
        speed[Y] = -0.80 * speed[Y]
    if location[Y] < 0:
        location[Y] = 0
        speed[Y] = -0.80 * speed[Y]
    # Adjust the speed to reflect the movement of the accelerometer. Note we assume
    # that the accelerometer callback has adjusted the direction variables
    speed[X] += dirx*acceleration
    speed[Y] -= diry*acceleration
    # Only go for a finite number of iterations. Then stop.
    frames += 1
    if frames > 400:
        running = 0
accelerometer.stop_listening()

The code starts by initializing some data, then it sets up callbacks for the accelerometer and the canvas. First let's look at the accelerometer callback:

def deviceMoved():
    global accelerometer, prevX, prevY, dirx, diry
    dirx = prevX - accelerometer.x
    diry = prevY - accelerometer.y
    prevX = accelerometer.x
    prevY = accelerometer.y

The function records values to indicate direction of movement and saves values for the next call. It takes no arguments, instead using global variables to set and retain accelerometer values. As the callback will be called while the main code is executing, sharing access to these variables could potentially be a problem. As it is, this function is the only code that changes these values, so no concurrency issues need to be considered.
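The delta computation itself can be replayed off-device by feeding a recorded sequence of x readings to a pure-function restatement of deviceMoved() (the function below is ours, written without globals):

```python
def direction_deltas(readings):
    # Restatement of deviceMoved() as a pure function: for each new
    # reading, the delta is previous - current, exactly as in the callback.
    prev = 0
    deltas = []
    for value in readings:
        deltas.append(prev - value)
        prev = value
    return deltas

print(direction_deltas([0, 5, 12, 12, 7]))   # [0, -5, -7, 0, 5]
```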

We set up the accelerometer with the following code, using a low pass filter to eliminate as much noise as possible and to smooth the data. We register the callback and start listening to the sensor right away.

accelerometer = AccelerometerXYZAxisData(data_filter=LowPassFilter())
accelerometer.set_callback(data_callback=deviceMoved)
accelerometer.start_listening()

Once the sensor has data, the variables dirx and diry will change according to the movement of the phone. We use this change to control the speed along both axes:

speed[X] += dirx*acceleration
speed[Y] -= diry*acceleration

Previously, I noted that the deviceMoved() function executes concurrently with the main code in the program. It executes as an active object (in Symbian platform parlance) and interrupts the main code when it needs to execute. This interruption can cause the main code to slow down. It is wise to explicitly share the CPU with the active object. We do that in the code as follows:
e32.ao_yield()


This call yields execution back to the system and asks the system to wake up the thread when necessary, such as for a sensor event. The system treats the thread as a Symbian C++ active object, which is an efficient asynchronous method of handling system events. Without the yield, the thread will loop repeatedly, even when there is no event to process. If there is no event, the code will redraw the ball in the same location as the previous iteration, causing it to slow down due to wasted graphics manipulation. This performance degradation can be dramatic: in one trial, 400 frames ran in 36 seconds with the e32.ao_yield() call and 65 seconds without it.
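Before moving on, note that the wall-bounce arithmetic in the main loop is independent of the UI and can be checked off-device. The helper below is a pure-function restatement of the "if we hit a wall" branches; the 0.80 damping factor is an assumption, since the original listing elides the bounce bodies:

```python
def bounce(position, velocity, low, high, damping=0.80):
    # Clamp the position to the [low, high] range and reverse (and damp)
    # the velocity when a wall is hit: the same logic as the four
    # wall-collision branches in the main loop.
    if position > high:
        return high, -damping * velocity
    if position < low:
        return low, -damping * velocity
    return position, velocity

print(bounce(205.0, 3.0, 0, 200))   # hits the right wall, velocity reversed
print(bounce(50.0, 3.0, 0, 200))    # (50.0, 3.0)
```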

Detecting gestures

An interesting use of sensor data is to detect gestures: movements of a phone that can be used to control applications. For this example, we will focus on two gestures: a shake up and down and a shake by moving your wrist to the left and back.

To detect a shake, we start by observing gesture data to determine exactly what a shake gesture looks like in three dimensional space. The following code records accelerometer data into a file:

import time, sensor, e32
datafile = open("E:\\Python\\sensordata", "w")
meter = sensor.AccelerometerXYZAxisData(data_filter=sensor.LowPassFilter())
def reportXYZ():
    global meter, datafile
    line = "(%d,%d,%d):%s\n" % (meter.x, meter.y, meter.z, time.strftime("%H:%M:%S"))
    datafile.write(line)
meter.set_callback(data_callback=reportXYZ)
meter.start_listening()

This code writes three coordinates and a time-stamp to the file, producing one line per sensor callback in the (x,y,z):HH:MM:SS format given by the format string above.
By analysing the data, we discover that a downward shake takes between 1.5 and 2 seconds and varies across the Y axis by between 40 and 70 points. So we need code that records the Y coordinates with a time stamp. If, over the last 1.8 seconds, the maximum Y point minus the minimum Y point is greater than 40 AND we are within 5 points of the value from 1.8 seconds ago, we have a shake along the Y axis.

Consider the following (note that this also looks for a shake along the X axis).

import time, sensor, e32
meter = sensor.AccelerometerXYZAxisData(data_filter=sensor.LowPassFilter())
start = time.time()
timeWindow = [start]
xaxisWindow = [0]
yaxisWindow = [0]
def recordEvents():
    global meter, timeWindow, xaxisWindow, yaxisWindow
    now = time.time()
    timeWindow.append(now)
    xaxisWindow.append(meter.x)
    yaxisWindow.append(meter.y)
    while now - timeWindow[0] > 2:
        timeWindow = timeWindow[1:]
        xaxisWindow = xaxisWindow[1:]
        yaxisWindow = yaxisWindow[1:]
    if now - timeWindow[0] < 1.8: return
    if xaxisWindow[0]-meter.x < 5 and max(xaxisWindow)-min(xaxisWindow) > 40:
        print "XSHAKE"
        timeWindow = [now]
        xaxisWindow = [meter.x]
    if yaxisWindow[0]-meter.y < 5 and max(yaxisWindow)-min(yaxisWindow) > 40:
        print "YSHAKE"
        timeWindow = [now]
        yaxisWindow = [meter.y]
meter.set_callback(data_callback=recordEvents)
meter.start_listening()

The code starts by initializing several lists:

  • timeWindow is a list of timestamps, initialized with the current time
  • xaxisWindow and yaxisWindow are lists of X coordinates and Y coordinates, respectively.

We do not keep them in (X,Y) pairs so we can use the min() and max() functions on each axis separately. The idea here is that each window should contain recorded data for 1.8 seconds. This is a moving window, so as time shifts, data shifts in and out of each list.

The recordEvents() function maintains these moving lists and evaluates if a shake has occurred. The function starts by simply recording the current time and (X,Y) coordinates:

now = time.time()
timeWindow.append(now)
xaxisWindow.append(meter.x)
yaxisWindow.append(meter.y)

The function then shifts each list until we have approximately 2 seconds of data:

while now - timeWindow[0] > 2:
    timeWindow = timeWindow[1:]
    xaxisWindow = xaxisWindow[1:]
    yaxisWindow = yaxisWindow[1:]

At this point, the oldest data in each list is less than 2 seconds older than the newest. If the window covers less than 1.8 seconds, we need to get more data; we simply return in this case and do not consider the data further.

If we have a valid collection of data, we simply need to apply our criteria to it:

if xaxisWindow[0]-meter.x < 5 and max(xaxisWindow)-min(xaxisWindow) > 40:
    print "XSHAKE"
    timeWindow = [now]
    xaxisWindow = [meter.x]
if yaxisWindow[0]-meter.y < 5 and max(yaxisWindow)-min(yaxisWindow) > 40:
    print "YSHAKE"
    timeWindow = [now]
    yaxisWindow = [meter.y]

When we determine that a shake has occurred, we print a message and reset the data collections.
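The window criterion itself can be verified off-device with synthetic samples. shake_in_window() below is our restatement of the two tests in recordEvents(), using the thresholds from the analysis above:

```python
def shake_in_window(times, values, span=1.8, settle=5, swing=40):
    # The conditions from recordEvents(): the window must cover at least
    # `span` seconds, the newest value must be within `settle` points of
    # the oldest, and the window must swing by more than `swing` points.
    if times[-1] - times[0] < span:
        return False
    return (values[0] - values[-1] < settle and
            max(values) - min(values) > swing)

# Synthetic Y-axis samples: down 50 points and back within ~2 seconds
times = [0.0, 0.5, 1.0, 1.5, 2.0]
ys    = [10, -20, -40, -15, 8]
print(shake_in_window(times, ys))        # True
print(shake_in_window(times, [10]*5))    # False: no swing
```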

The code will detect a shake of the phone in a downward direction; it works because down is positive. For the code to react the same way to an upward shake, absolute values of the Y axis data must be appended to the yaxisWindow list. For this, the math module must be imported and math.fabs() used. The same change could be applied for an X axis shake to the right.
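A quick check with synthetic data shows why absolute values make the detection symmetric (abs_swing() is our helper name, not part of the chapter's code):

```python
import math

def abs_swing(values):
    # Append absolute values, as suggested for detecting upward shakes:
    # the swing of |y| is the same whichever direction the shake starts.
    window = [math.fabs(v) for v in values]
    return max(window) - min(window)

down_shake = [5, -20, -45, -10, 5]    # downward first
up_shake   = [-5, 20, 45, 10, -5]     # upward first (mirrored)
print(abs_swing(down_shake) == abs_swing(up_shake))   # True
```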

While it is perhaps natural for a downward or upward shake to be used as a gesture, a sideways shake is not as natural. It seems more natural to move one's wrist to the left, which involves both X and Y axis movements. In fact, the movements combine leftward X axis movement with downward Y axis movement, and the distance travelled along each axis is less than in a single-axis movement. We can detect a wrist shake with a small code addition:

if xaxisWindow[0]-meter.x < 5 and max(xaxisWindow)-min(xaxisWindow) > 25 \
   and yaxisWindow[0]-meter.y < 5 and max(yaxisWindow)-min(yaxisWindow) > 25:
    print "WRISTSHAKE"
    timeWindow = [now]
    xaxisWindow = [meter.x]
    yaxisWindow = [meter.y]
if xaxisWindow[0]-meter.x < 5 and max(xaxisWindow)-min(xaxisWindow) > 40:
    print "XSHAKE"
    timeWindow = [now]
    xaxisWindow = [meter.x]
if yaxisWindow[0]-meter.y < 5 and max(yaxisWindow)-min(yaxisWindow) > 40:
    print "YSHAKE"
    timeWindow = [now]
    yaxisWindow = [meter.y]

Finally, to act on shaking, we should replace the print statements with some other kind of functionality. We would normally use a callback function, but recordEvents() is itself a callback and we cannot pass it our own parameters (as would be needed to specify our own callback). The alternative is to hardcode a call into the code and define the function later in the program.

Another common gesture might be to move a phone backwards or forwards along the Z axis. Extending the preceding code to look for these types of gestures is left as an exercise for the reader.

Legacy sensor API

The sensor framework is available on devices based on the Symbian platform, and on older S60 devices (S60 3rd Edition, FP2 and later). Earlier platforms (S60 3rd Edition, S60 3rd Edition Feature Pack 1 and some S60 3rd Edition Feature Pack 2 devices) use a legacy "sensor API" instead.

The legacy sensor API is limited compared to the sensor framework we've already discussed. It provides:

  • Access to a much smaller set of sensors than the sensor framework: accelerometer, tapping sensor and rotation sensor.
  • Limited ability to register for specific sensor information. Unlike the sensor framework, the sensor API provides a single callback for data from all sensors. Applications can use in-built filters to capture orientation changes and rotation changes, but have to create their own filters for other events.
  • The interfaces are not readily extensible.


Summary

This chapter showed how to use the sensor framework and gave examples of using the accelerometer to control an application and to interpret gestures.

The sensor framework allows applications to query for specific sensors and to register for notification of data from sensors of interest. The API is readily extensible so applications will not be adversely affected by the addition of new sensors in future.

Licence icon cc-by-sa 3.0-88x31.png© 2010 Symbian Foundation Limited. Portions copyright Bogdan Galiceanu, Hamish Willee, Marcelo Barros de Almeida, Mike Jipping, Pankaj Nathani and others in wiki document history list. This document is licensed under the Creative Commons Attribution-Share Alike 2.0 license. See for the full terms of the license.
Note that this content was originally hosted on the Symbian Foundation developer wiki.
