The easiest way of logging is to add
System.out.println() lines to your code. This can be useful in the emulator environment, but on real devices something else is needed. One efficient option is the Microlog logging library. It supports Java ME and Android and makes it possible to log to a file, to a PC via Bluetooth, and to online servers. Only a couple of lines of code need to be added to the MIDlet source code.
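As a minimal sketch of what those couple of lines look like, the following MIDlet obtains a logger from Microlog's LoggerFactory and logs a message at startup (this assumes the Microlog V2 library is on the build path; the class and method names are from the net.sf.microlog.core package, and the appender/formatter setup is taken from a microlog.properties resource rather than shown here):

```
// Sketch only: requires the Microlog library and a MIDP environment.
import javax.microedition.midlet.MIDlet;
import net.sf.microlog.core.Logger;
import net.sf.microlog.core.LoggerFactory;

public class LoggingMIDlet extends MIDlet {

    // One logger per class is the usual Microlog pattern.
    private static final Logger log =
            LoggerFactory.getLogger(LoggingMIDlet.class);

    protected void startApp() {
        log.info("MIDlet started");
    }

    protected void pauseApp() {
    }

    protected void destroyApp(boolean unconditional) {
        // Flush and close the configured appenders
        // (file, Bluetooth, server, ...) before exit.
        LoggerFactory.shutdown();
    }
}
```

Where the log output ends up (file, Bluetooth, server) is decided by the configuration, not by this code, so the same logging calls work unchanged in the emulator and on a real device.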
Read more about logging with Microlog on the Developer Nokia Wiki.
Speech recognition and Natural Language Understanding (NLU) are big topics these days. More and more mobile devices with integrated, voice-driven personal assistant applications are being launched, and users are becoming accustomed to this kind of technology.
Being able to tell your phone to “Schedule a meeting with John tomorrow at 5PM” relieves users of tedious, repetitive tasks that would normally be accomplished by opening an application, interacting with a few touch controls, and typing words into a form.
Nokia Asha uses swipes for some platform features that cannot be disabled:
- Swipe from the top opens the notification panel
- Swipe from the bottom opens the Options menu (this applies only to apps that use Commands); if there is no Options menu, the bottom swipe does nothing.
- Swipe from either side closes the currently open MIDlet
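Since the bottom swipe only does something when the app registers Commands, a sketch of what that means in code may help. The example below adds an LCDUI Command to a Form; on an Asha device, the platform would then surface it behind the bottom-swipe Options menu (the command label and priority here are arbitrary illustrations):

```
// Sketch only: requires a MIDP/LCDUI environment (Nokia Asha).
import javax.microedition.lcdui.Command;
import javax.microedition.lcdui.CommandListener;
import javax.microedition.lcdui.Display;
import javax.microedition.lcdui.Displayable;
import javax.microedition.lcdui.Form;
import javax.microedition.midlet.MIDlet;

public class SwipeDemoMIDlet extends MIDlet implements CommandListener {

    private final Form form = new Form("Swipe demo");
    // Registering at least one Command is what makes the
    // bottom-swipe Options menu appear on Asha.
    private final Command aboutCommand =
            new Command("About", Command.SCREEN, 1);

    protected void startApp() {
        form.addCommand(aboutCommand);
        form.setCommandListener(this);
        Display.getDisplay(this).setCurrent(form);
    }

    public void commandAction(Command c, Displayable d) {
        if (c == aboutCommand) {
            form.append("About selected from the Options menu\n");
        }
    }

    protected void pauseApp() {
    }

    protected void destroyApp(boolean unconditional) {
    }
}
```

The side swipes, by contrast, are handled entirely by the platform: a MIDlet cannot intercept them, which is why they are listed above as behaviour that cannot be disabled.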
Lately, I’ve received several similar questions from developers, and in response, I’ve decided to write a few words about why one size doesn’t fit all, i.e. why you cannot create one app that runs beautifully on all phones. (My post on a related topic is here.)
I’ve recently had some interesting discussions with developers about user experience and how it affects project costs. There seem to be two quite opposite ways of thinking.
People often think that “adding UX” will increase the costs. I think that “applying UX” will decrease the costs. And I fully agree with both statements! So, what’s the difference?