Based on what I’ve recently seen, it seems to be a bit of a mystery what you should do when moving your app from a non-touch screen to a touch screen. Here are three things for you to consider!
1) There’s no visible “focus” on the screen on touch UI
When operating a non-touch phone you usually have a navigation key that you use to move around the UI elements. What is shown on the screen is the “focus”: a highlight on the element that is currently active and that an action would apply to. For example, you use the navigation key to move the highlight on top of the list item you want to do something with, and then “Select” the item with a key press.
On a touch screen this kind of interaction is no longer needed, because you directly touch the item you want to interact with. For example tapping on a list item opens it. It’s very important from the user’s perspective that the first touch on the list item opens it (or triggers the action intended). If two separate taps are required to trigger the action, the user feels betrayed! “Why didn’t that happen with the first tap?”.
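To make this concrete, here is a minimal sketch in plain Java (not actual MIDP code, though the handler name mirrors `pointerReleased(x, y)` from a MIDP Canvas): the tap coordinate is mapped straight to a list item and the item’s action fires immediately, with no separate focus-then-select step. The row height and the `open` action are illustrative assumptions.

```java
// Sketch: act on the FIRST tap -- map the touch coordinate directly
// to a list item, with no focus step in between.
public class TapList {
    static final int ITEM_HEIGHT = 48;   // assumed row height in px
    private final String[] items;

    TapList(String[] items) { this.items = items; }

    // Called from a pointer handler such as MIDP's pointerReleased(x, y).
    // Returns the opened item, or null if the tap missed the list.
    String pointerReleased(int x, int y) {
        int index = y / ITEM_HEIGHT;
        if (index < 0 || index >= items.length) return null;
        return open(items[index]);       // trigger the action right away
    }

    private String open(String item) {
        return "opened:" + item;         // placeholder for the real action
    }
}
```

The key point is that there is no state in between: no “focused” item waiting for a second tap to confirm it.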
Focus shown on non-touch grid menu, no focus on Full touch grid menu
2) Content follows the finger
Again, on non-touch phones you need to work with the navigation key. When any content is scrollable, you use the navigation key to move the content down and back up again. The scroll bar on the side of the content moves in the same direction as you are pressing the navigation key: when pressing navigation key down, the scroll bar moves down as well.
When moving to a touch UI you are no longer working with the scroll bar in the same way; the scroll bar is only an indicator hinting to the user that there is more content than can fit on the screen. It tells you the relative position in the content. But when you actually want to scroll the content, you flick or drag the content with your finger. And this is where the logic changes: you are not moving the scroll bar anymore (moving downwards to show more content below the fold), but instead you are moving the actual content with your finger (moving the content up the screen to see more). So pressing the navigation key down actually equals flicking up!
When flicking a list up, the content moves up
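The direction change is easy to get wrong in code, so here is a small plain-Java sketch (again, illustrative rather than actual MIDP code) of “content follows the finger”: the scroll offset tracks the drag, so moving the finger up increases the offset and reveals more content below, and the scroll bar position would simply be derived from that offset.

```java
// Sketch: drag-to-scroll where the content moves WITH the finger.
public class DragScroller {
    private final int contentHeight, viewHeight;
    private int offset;   // how far the content has been scrolled down
    private int lastY;

    DragScroller(int contentHeight, int viewHeight) {
        this.contentHeight = contentHeight;
        this.viewHeight = viewHeight;
    }

    void pointerPressed(int y) { lastY = y; }

    void pointerDragged(int y) {
        // Finger moving UP (y decreases) pushes the content up,
        // revealing more content below -> offset increases.
        offset += lastY - y;
        lastY = y;
        // Clamp so we never scroll past the content edges.
        offset = Math.max(0, Math.min(offset, contentHeight - viewHeight));
    }

    int offset() { return offset; }
}
```

Note the sign: with a navigation key you would have written `offset += 1` for “key down”, but here a *downward* finger movement decreases the offset.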
3) Touch areas have to be large enough
UI element size on non-touch phones is not terribly critical, since you use the (famous!) navigation key to highlight the wanted item. So the only variable affecting size on non-touch is basically font size; you should always keep the font size readable. For a Series 40 non-touch MIDlet with 240×320 resolution this would mean that no smaller than a 16 point font should be used.
I cannot emphasize enough the importance of keeping your touch UI elements large enough for pleasant finger use! It really becomes painful to use an app where you need to very carefully place your finger on an element because, a) it’s so tiny that you cannot really see whether you are hitting it or not (your finger blocks the view) and b) the margin between elements is so small that you are likely to hit two or more elements at the same time. UI elements that are too small on a touch screen basically make the app unusable on the move; consider sitting on a bus and trying to hit a precise spot on the screen with your finger. It’s not easy, and you will probably miss more often than you hit on the first try.
The recommended touch element size is 7×7 mm, which equals 43×43 px on an Asha Full Touch. The gap (empty, untouchable area) between two elements should be 1 mm (i.e. 6 px on Full Touch).
Recommended minimum touch element size on Full touch phones is 43×43 px with 6 px margins
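If you are targeting several screens, it can help to derive the pixel size from the physical recommendation rather than hard-coding 43 px. The sketch below (plain Java) does that conversion; the ~155 ppi density used in the test is my assumption for an Asha full-touch display, so check the actual device specs.

```java
// Sketch: derive touch-target pixel sizes from the 7 mm recommendation.
public class TouchMetrics {
    static final double MM_PER_INCH = 25.4;

    // Physical millimetres -> pixels for a given pixel density (ppi).
    static int mmToPx(double mm, double ppi) {
        return (int) Math.round(mm * ppi / MM_PER_INCH);
    }

    // Does an element meet the 7x7 mm minimum on this display?
    static boolean isTouchableSize(int widthPx, int heightPx, double ppi) {
        int min = mmToPx(7.0, ppi);
        return widthPx >= min && heightPx >= min;
    }
}
```

At roughly 155 ppi, 7 mm comes out to 43 px and 1 mm to 6 px, matching the figures above.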
Do you have a specific UX concern in mind you would wish to hear more about? Let me know and I’ll try to answer it!