I'm writing a Series 60 1.x app that includes sound. The app has a tone player, based on a class that looks like this:
class CTonePlayer : public CBase, public MMdaAudioToneObserver
The app also has a sound player, based on a class that looks like this:
class CSoundPlayer : public CBase, public MMdaAudioPlayerCallback
The tone player plays a tone every time a bitmap moves; the sound player occasionally plays a WAV file. Both work fine on their own. The problem arises when the tone and the WAV file must play simultaneously: afterwards, one or the other of the sounds won't play any more. I suspect the play-complete callbacks are getting confused. Do I need to play these on separate channels somehow? I can't find any documentation on doing that. Any ideas?
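To show the structure in slightly more detail, here is a stripped-down sketch of the two classes. Note this is a compilable mock, not my real code: the observer interfaces are stand-ins for the Symbian ones in mdaaudiotoneplayer.h and mdaaudiosampleplayer.h (with int in place of TInt, CBase omitted), and the iPlaying flags are just illustrative names I added, though the MatoXxx/MapcXxx callback names match the real mixins.

```cpp
// Stand-ins for the Symbian mixin interfaces. The real ones are declared in
// mdaaudiotoneplayer.h (MMdaAudioToneObserver) and mdaaudiosampleplayer.h
// (MMdaAudioPlayerCallback); plain int replaces TInt here.
class MMdaAudioToneObserver {
public:
    virtual void MatoPrepareComplete(int aError) = 0;
    virtual void MatoPlayComplete(int aError) = 0;
};

class MMdaAudioPlayerCallback {
public:
    virtual void MapcInitComplete(int aError, long long aDurationMicroSecs) = 0;
    virtual void MapcPlayComplete(int aError) = 0;
};

// Tone player: receives tone-utility callbacks. The iPlaying flag is my own
// illustrative state, tracking whether this player thinks it is busy.
class CTonePlayer : public MMdaAudioToneObserver {
public:
    bool iPlaying = false;
    void MatoPrepareComplete(int /*aError*/) override { iPlaying = true; }
    void MatoPlayComplete(int /*aError*/) override { iPlaying = false; }
};

// Sample player: receives player-utility callbacks, same idea.
class CSoundPlayer : public MMdaAudioPlayerCallback {
public:
    bool iPlaying = false;
    void MapcInitComplete(int /*aError*/, long long /*aDurationMicroSecs*/) override { iPlaying = true; }
    void MapcPlayComplete(int /*aError*/) override { iPlaying = false; }
};
```

The point of the sketch is that each player only ever sees its own completion callbacks, so in principle the two sets of state should be independent; the failure only shows up when both utilities are playing at once.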