Because stereo streams aren’t supported in 3D, and because I needed accurate synchronisation between the front and rear outputs, I decided to capture the audio of a stereo stream in a stream DSP callback and then play that buffer back through 2 or 4 mono streams.
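The deinterleaving step this describes can be sketched in plain C (no FMOD calls; the function name and buffer handling are my own illustration, not the poster’s actual code):

```c
#include <stddef.h>

/* Split an interleaved stereo buffer (L R L R ...) into two mono
   buffers, as a stream DSP callback would do before handing each
   half to a per-corner mono stream. */
static void deinterleave_stereo(const short *stereo, size_t frames,
                                short *left, short *right)
{
    for (size_t i = 0; i < frames; ++i) {
        left[i]  = stereo[2 * i];
        right[i] = stereo[2 * i + 1];
    }
}
```

For 4 outputs the same split is done once and the two mono buffers are fed to both the front and rear streams on each side.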

I currently have some code that works pretty well.
It creates 4 new streams, one located at each corner, so I can decide for myself where to place each source channel.

As I said, the code works most of the time, but sometimes the 4 new streams aren’t perfectly synchronised on creation, even though the play and unpause calls follow immediately after each other with no other functions in between.

Sometimes I can detect this by checking the difference in FSOUND_Stream_GetTime between two of the streams: it is 23 ms when they are out of sync and 0 when they are in sync.
I can’t sync them exactly with SetFrequency, though, because GetTime is only updated in 23 ms intervals (fmod’s buffer size, I suppose).
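That 23 ms figure is consistent with one mixer block of 1024 samples at a 44.1 kHz mix rate (1024 / 44100 ≈ 23.2 ms), which would explain why GetTime only moves in such steps. A small sketch of comparing two GetTime readings at that granularity (the block size and mix rate are assumptions on my part):

```c
/* GetTime advances in mixer-block steps (1024 samples at 44.1 kHz is
   ~23 ms), so two streams can only be compared at block granularity:
   a difference of 0 means "same block", ~23 ms means "one block apart". */
static int blocks_apart(int time_a_ms, int time_b_ms, int block_ms)
{
    int diff = time_a_ms - time_b_ms;
    if (diff < 0)
        diff = -diff;
    return (diff + block_ms / 2) / block_ms; /* round to whole blocks */
}
```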

It also seems to play cleanly only when the stream buffer size is 4000 samples or more, and I had hoped it would work with a buffer of 1000 or 2000 samples so I could reduce the lag for effects.
Why are DSP effects possible at 1000-sample intervals on the final output (regular DSPs in the DSP chain) but not for streams?
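For reference, converting those buffer sizes to milliseconds at an assumed 44.1 kHz mix rate shows the latency gap the question is about:

```c
/* Latency of a stream buffer in milliseconds at a given mix rate.
   At 44.1 kHz: 4000 samples ~ 90.7 ms, 1000 samples ~ 22.7 ms
   (roughly one mixer block).  The 44.1 kHz rate is an assumption. */
static double buffer_ms(int samples, int mixrate)
{
    return 1000.0 * samples / mixrate;
}
```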

So with FSOUND_GetCurrentPosition I should be able to sync hardware streams myself?
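Assuming FSOUND_GetCurrentPosition gives a sample-accurate position per channel, the catch-up could be pure arithmetic: measure the drift in samples, raise the lagging channel’s frequency briefly, then restore the base rate. A hypothetical sketch (the helper name and the correction window are mine, not part of the FMOD API):

```c
/* Given a drift of `drift_samples` between two channels (measured via
   their current positions), return the playback frequency that lets
   the lagging channel recover the drift over `correct_ms` milliseconds.
   After that window the caller would set the frequency back to base_hz. */
static int corrected_frequency(int base_hz, int drift_samples, int correct_ms)
{
    /* extra samples per second needed to close the gap in time */
    int extra_per_sec = drift_samples * 1000 / correct_ms;
    return base_hz + extra_per_sec;
}
```

For a small drift the pitch change is inaudible, e.g. catching up 1024 samples over one second only raises a 44100 Hz stream to 45124 Hz.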

It isn’t software mixed because I need the rear channels, so I have 4 HW3D channels, one in each corner.

Thanks for your help, I will try it and hopefully it works well enough.

Is there no solution for the latency of streams?
You can add effects to the DSP chain with a buffer size of 23 ms, but a stream seems to need a buffer of 100 ms to play without stuttering.
