I am a newcomer to mobile programming. In the C++ library documentation I saw that there are some classes for using the codecs and decoders listed below, but I have no idea how I can do this in Java. I want to play streaming video and audio.
By the way, I only see the HTTP example where streaming is concerned. Does the Nokia SDK support any other protocols, such as RTP?
If so, is it possible to combine this with the codecs and make a multimedia player?
Basically, RTSP is supported on newer phones (with MIDP), but RTP is not. (Unfortunately, streaming does not work on the emulators, so you need to test on real devices.)
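As a sketch of what RTSP playback looks like through MMAPI (assuming a device whose MMAPI implementation accepts the rtsp:// locator; the URL is a placeholder):

```java
import java.io.IOException;

import javax.microedition.media.Manager;
import javax.microedition.media.MediaException;
import javax.microedition.media.Player;

public class RtspPlaybackSketch {
    public void play() {
        try {
            // The locator scheme selects the protocol; rtsp:// only
            // works where the device's MMAPI implementation supports it.
            Player p = Manager.createPlayer("rtsp://example.com/stream.3gp");
            p.realize(); // acquires the stream; typically fails on emulators
            p.start();
        } catch (IOException e) {
            // network or locator problem
        } catch (MediaException e) {
            // protocol or content type not supported on this device
        }
    }
}
```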
Also, the MIDP implementation does not necessarily support all the various file formats supported natively on the device. The system properties help you there: http://www.forum.nokia.com/document/...0ACF5F95A.html
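For example, you can query the MMAPI capability properties defined by JSR 135 at runtime (on a MIDP device these return values such as "true" or a list of MIME types; on other Java runtimes they are simply null):

```java
public class MediaCapsSketch {
    // Capability property names defined by the MMAPI (JSR 135) spec.
    static final String[] KEYS = {
        "supports.mixing",
        "supports.audio.capture",
        "supports.video.capture",
        "audio.encodings",
        "video.encodings",
        "streamable.contents"
    };

    // Returns one "key = value" line per property; value is null off-device.
    static String report() {
        StringBuffer sb = new StringBuffer();
        for (int i = 0; i < KEYS.length; i++) {
            sb.append(KEYS[i]).append(" = ")
              .append(System.getProperty(KEYS[i])).append('\n');
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        System.out.print(report());
    }
}
```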
And no, you cannot add a new codec to the phone's Java ME implementation, as there is no low-level access to the loudspeaker, microphone, or camera.
I have downloaded the J2ME library 2.3 documentation and read some of it. But as a newbie, I have no idea which part is most relevant to my goal. From your answer I can tell which parts I need to read further.
As for the last question, there may be some misunderstanding. What I want to build is a multimedia conferencing client for mobile phones using SIP and RTP. There are no existing files on the server; the media is generated in real time. So I was wondering whether RTP is supported, and if so, how to make the player play the media from the RTP stream, or how to use the supported codecs to produce a media stream suitable for sending to the server over RTP.
Now that I know RTP is not supported, does that mean I have to port an RTP stack to the platform? Is that possible, or are there other tricks?
The MMAPI documentation is relevant, as MMAPI is the API used to play and record video and audio. (As I said above, it is a really high-level API.)
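For orientation, a minimal MMAPI sketch of the two paths you will care about, playback and capture (the locator URLs are placeholders, and capture support varies per device):

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;

import javax.microedition.media.Manager;
import javax.microedition.media.MediaException;
import javax.microedition.media.Player;
import javax.microedition.media.control.RecordControl;

public class MmapiSketch {
    // Play a clip fetched over HTTP (progressive download, not RTP).
    void playHttp() throws IOException, MediaException {
        Player p = Manager.createPlayer("http://example.com/clip.mp3");
        p.start();
    }

    // Record from the microphone. Note: the recorded bytes only land
    // in the stream once you call stopRecord()/commit(), which is why
    // continuous sending is a problem.
    void recordAudio() throws IOException, MediaException {
        Player p = Manager.createPlayer("capture://audio");
        p.realize();
        RecordControl rc = (RecordControl) p.getControl("RecordControl");
        ByteArrayOutputStream buf = new ByteArrayOutputStream();
        rc.setRecordStream(buf);
        rc.startRecord();
        p.start();
        // ... later: rc.stopRecord(); rc.commit(); use buf.toByteArray()
    }
}
```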
And in your case, you should also check the SIP API documentation.
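The SIP API here is JSR 180. As a rough sketch of sending a simple SIP request (the address is a placeholder, and this only compiles where the javax.microedition.sip package is present):

```java
import java.io.IOException;

import javax.microedition.io.Connector;
import javax.microedition.sip.SipClientConnection;
import javax.microedition.sip.SipException;

public class SipSketch {
    // Send a SIP MESSAGE request - roughly the kind of signalling
    // building block a conference client needs before media flows.
    void sendMessage() throws IOException, SipException {
        SipClientConnection scc = (SipClientConnection)
            Connector.open("sip:user@example.com");
        scc.initRequest("MESSAGE", null); // null = no listening notifier
        scc.send();
        scc.close();
    }
}
```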
Implementing the RTP protocol might be possible (to my understanding it is just some extra framing on top of UDP, i.e. a datagram connection). However, the problem will be getting the Java ME implementation to play content streamed over RTP continuously, or to send the recorded content over RTP continuously (you cannot access the recorded audio/video data until you have stopped the recording).
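To make that "extra framing" concrete: an RTP stack prepends a 12-byte fixed header (RFC 3550) to every payload before sending it as a UDP datagram. Building the header is the easy part (the field values below are arbitrary examples); the continuous playback and capture is the hard part:

```java
public class RtpHeaderSketch {
    // Builds the 12-byte fixed RTP header (RFC 3550), big-endian.
    static byte[] build(int payloadType, int seq, long timestamp, long ssrc) {
        byte[] h = new byte[12];
        h[0] = (byte) 0x80;                  // version 2, no padding/extension/CSRC
        h[1] = (byte) (payloadType & 0x7F);  // marker bit 0, 7-bit payload type
        h[2] = (byte) (seq >> 8);            // 16-bit sequence number
        h[3] = (byte) seq;
        for (int i = 0; i < 4; i++) {        // 32-bit media timestamp
            h[4 + i] = (byte) (timestamp >> (24 - 8 * i));
        }
        for (int i = 0; i < 4; i++) {        // 32-bit SSRC source identifier
            h[8 + i] = (byte) (ssrc >> (24 - 8 * i));
        }
        return h;
    }

    public static void main(String[] args) {
        // PCMU (payload type 0), sequence 1, 160-sample timestamp step
        byte[] h = build(0, 1, 160L, 0x12345678L);
        System.out.println(h.length);        // 12
        System.out.println(h[0] & 0xFF);     // 128 (version 2)
    }
}
```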
So basically you have to use the protocols and formats supported by the platform.
Does this help at all, or just make you more confused?