I’m writing a custom DSP. However, when I add it to a channel playing a sound, FMOD does not stop playback of that sound. Instead, I get weird noise after the end of the sound. Presumably, this is because my custom DSP read callback does not check for the end of the input data. How is this supposed to work? How can a DSP detect that there will be no more valid samples in the input and how can a DSP signal that there will be no further valid output (my DSP has a latency of length/2)?
- gmueckl asked 7 years ago
I can’t quite believe that nobody knows the answer to this one. If I haven’t stated the problem clearly enough or if you need more information, please feel free to ask. Any kind of help is appreciated. To me, this is a quite annoying problem and I’m not getting closer to a solution by myself.
DSPs aren’t aware of higher-level concepts like file length; they simply process data a chunk at a time.
If invalid data is flowing into the DSP callback then something is wrong with however the audio is being fed in.
Can you provide some sample code to demonstrate what you are doing? Alternatively, have you looked at the dsp_custom example?
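To illustrate the "chunk at a time" model: an FMOD custom DSP's read callback is simply handed a buffer of interleaved float samples on every mix cycle, and FMOD stops calling it when the channel stops — the callback itself never sees an "end of data" marker. Below is a minimal, self-contained sketch of the processing loop such a callback would contain. The function name `dsp_read_chunk` and the `gain` parameter are illustrative; the real FMOD callback has a different signature (it receives an `FMOD_DSP_STATE*`, as shown in the `dsp_custom` example), but the core per-sample work is the same.

```cpp
#include <cassert>

// Hypothetical stand-in for the body of an FMOD custom-DSP read callback.
// FMOD calls the real callback once per mix block with `length` frames of
// `channels` interleaved float samples; there is no end-of-file signal --
// the callback just stops being invoked when playback ends.
void dsp_read_chunk(const float* inbuffer, float* outbuffer,
                    unsigned int length, int channels, float gain)
{
    const unsigned int samples = length * static_cast<unsigned int>(channels);
    for (unsigned int i = 0; i < samples; ++i)
        outbuffer[i] = inbuffer[i] * gain; // simple pass-through with gain
}
```

A DSP with internal latency (like the length/2 delay mentioned in the question) would keep its own state between calls, but it still only ever sees one chunk at a time.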
Thanks for your offer to help, but I finally found the problem: in a terrible case of user error I failed to notice that my main loop was actually missing a periodic call to System::update(). Adding that call has fixed this problem.
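For anyone hitting the same symptom: FMOD requires `System::update()` to be called regularly (typically once per frame or tick) so the mixer, channel state, and callbacks are serviced. The sketch below shows the essential loop shape; `FakeSystem` is a hypothetical stand-in so the example is self-contained, whereas real code would call `FMOD::System::update()` on the system object from `<fmod.hpp>`.

```cpp
// Hypothetical stand-in for FMOD::System, used only to keep this
// sketch self-contained without the FMOD SDK.
struct FakeSystem {
    int updates = 0;
    void update() { ++updates; } // real code: FMOD::System::update()
};

// The essential main-loop shape: service the system once per iteration.
// Returns how many times update() ran, for demonstration purposes.
int run_main_loop(FakeSystem& system, int frames)
{
    for (int i = 0; i < frames; ++i) {
        // ... per-frame application work ...
        system.update(); // without this, channel/DSP state goes stale
    }
    return system.updates;
}
```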