Brett, thanks for all the answers you’ve provided for us so far – it’s been really helpful. Right now I’m stuck with DSP effects on streams.
Essentially I want to run custom DSP effects on individual sounds, so I’ve opened those sounds as streams and registered my own DSP callback on each. However, the callback is told a length of 8192 samples, which seems pretty high, and there doesn’t seem to be that much data actually available. In fact, my app crashes just while reading this data from the provided buffer, which suggests to me that there aren’t really 8192 samples there.
I’m guessing this is 200 ms worth of data – the default stream buffer length. I tried running the same callback as a standard DSP unit, created with FSOUND_DSP_Create, and there it received 512 samples per call, all of which looked fine. 8192 is 16 times 512, and at 44.1 kHz it works out to roughly 186 ms, which strongly suggests to me that 8192 is the 200 ms stream buffer. But if you could clear this up for me, I’d be grateful.
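For reference, here’s the arithmetic I’m using. The 44.1 kHz output rate is an assumption on my part (I haven’t overridden the default), so the figures would shift if the mixer is actually running at a different rate:

```c
/* Milliseconds of audio represented by `samples` sample frames at `rate` Hz. */
static double buffer_ms(int samples, int rate)
{
    return 1000.0 * (double)samples / (double)rate;
}

/* buffer_ms(8192, 44100) is ~185.8 ms, i.e. right around the 200 ms
 * default stream buffer, while buffer_ms(512, 44100) is ~11.6 ms.
 * If the output rate were 22.05 kHz instead, both figures would double
 * (512 frames would then be ~23 ms). */
```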
So am I accessing this data incorrectly? I’m treating it as 16-bit stereo data, and the mixer mode is definitely FSOUND_MIXER_QUALITY_AUTODETECT. Do DSPs running on streams receive a different type of data – the stream’s source format, perhaps?
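In case it helps, here’s a standalone sketch of how my callback currently reads the buffer. It assumes `length` is a count of sample frames and the data is interleaved signed 16-bit stereo; the halve-volume effect is just a placeholder, and the FMOD-specific callback signature is omitted so this compiles on its own. If stream DSPs actually hand over source-format (or float) data, this interpretation is wrong and would read past the end of the buffer – which would explain the crash:

```c
#include <stdint.h>

/* Process `length` sample frames of interleaved signed 16-bit stereo
 * in place. This mirrors the body of my stream DSP callback. */
static void process_16bit_stereo(int16_t *buf, int length)
{
    for (int i = 0; i < length; i++) {
        int16_t left  = buf[2 * i];       /* left channel of frame i  */
        int16_t right = buf[2 * i + 1];   /* right channel of frame i */

        /* placeholder effect: halve the volume of both channels */
        buf[2 * i]     = (int16_t)(left  / 2);
        buf[2 * i + 1] = (int16_t)(right / 2);
    }
}
```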