Thanks a bunch for your reply. I've implemented this and it almost works (frustratingly so). Actually the first time it worked perfectly, but I was disappointed to see that it was just a fluke (somehow the render and audio threads must have lined up).
Anyway, the odd behavior I'm getting now is that the DSP callback fires very consistently, but once in a while it comes in about 10ms later than usual. I'm running at 48 kHz, and the printed deltas are steady apart from those outliers.
Deltas in the 30ms range occur regularly, around every 7-8 callbacks. I'm running a 1024/2 DSP buffer; as I raise the buffer size, the outliers become more frequent (every 2-3 calls) but stay at a 10ms delta. This makes sense (I guess), since the time between callbacks is now much larger. I'm trying to run my game logic synced to the audio time, since it's a rhythm game, and this is causing some odd jittering (much better than before, without interpolation, but still noticeable). At a crazy buffer size of around 10K samples, for example, the jittering occurs only rarely, but of course the sound latency is horrendous.
I think I can probably work around this, but it seems odd, so I figured I'd ask whether this is expected behavior. Do you think something in my app might be starving the callback, for example? I've added it to the head node, as you suggested, so it takes no CPU as it is. I've also inlined the timers for the test, out of paranoia that some wacky function-call delays were going on. I even made sure not to printf in the middle of the logic, in case that was skewing things (there's no change with the printf commented out).
Code:
static double last_read_time = 0;

FMOD_RESULT F_CALLBACK timerDSPCallback(
    FMOD_DSP_STATE *dsp_state,
    float          *inbuffer,
    float          *outbuffer,
    unsigned int    length,
    int             inchannels,
    int             outchannels) // trailing params got cut off in the paste; check against your FMOD header
{
    LARGE_INTEGER freq, time;
    QueryPerformanceFrequency(&freq); // these two calls were dropped in the paste
    QueryPerformanceCounter(&time);

    double ctime = (double)time.QuadPart / (double)freq.QuadPart; // inlined getCurrTime()
    double delta = ctime - last_read_time;
    last_read_time = ctime;
    printf("delta %f\n", delta * 1000);

    memcpy(outbuffer, inbuffer, length * inchannels * sizeof(float)); // pass-through
    return FMOD_OK;
}
The game logic works off the DSP clock's PCM samples (I should probably switch this to milliseconds, but I wanted to stay as low-level as possible until everything more or less worked).
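For what it's worth, the samples-to-milliseconds conversion I'd switch to is trivial (a sketch, assuming the mixer rate stays fixed at 48 kHz; the function names are just mine):

```c
#include <stdint.h>

/* Sketch: convert between the DSP clock's PCM samples and milliseconds,
 * assuming a fixed 48 kHz mixer rate. */
static double samplesToMs(uint64_t samples)
{
    return (double)samples * 1000.0 / 48000.0;
}

static uint64_t msToSamples(double ms)
{
    return (uint64_t)(ms * 48000.0 / 1000.0);
}
```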
Code:
uint64 curr_time = sfxGetDSPClock();               // DSP clock, in PCM samples
double last_dsp_time = sfxGetLastReadTime();       // ensure this is called first
double curr_sys_time = getCurrTime();              // and this is called second
double delta_time = curr_sys_time - last_dsp_time; // seconds since the last DSP read
uint64 dt = (uint64)(delta_time * 48000.0);        // seconds -> samples at 48 kHz
curr_time += dt;                                   // extrapolate the DSP clock to "now"
After this, curr_time flows into the game logic. The odd part is that I can't see why the extra 10ms between DSP calls would actually cause any issues, since it should just interpolate over a longer interval. I don't hear any music chopping either, which is also confusing, since I didn't hear any audio chopping back when I first did this through addDSP and memcpy'ed in->out. My only guess as to why it's jittering is that the DSP clock somehow updates to the next block before the callback has fired, so there's a mismatch between curr_time and the offset I should be adding.
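The workaround I have in mind, in case it's useful to anyone: filter the extrapolated time before it hits the game logic, so that a late callback (or a DSP clock that has already advanced a block) can't make game time jump or run backwards. This is just a sketch of my idea, not anything from the FMOD API, and the one-block clamp policy is my own assumption:

```c
#include <stdint.h>

#define BLOCK_SAMPLES 1024 /* DSP buffer length from above */

static uint64_t smoothed_time = 0; /* last time handed to the game logic */

/* Sketch: clamp the extrapolated DSP time (in samples) so it is monotonic
 * and never advances by more than one mix block per query. */
uint64_t getGameTimeSamples(uint64_t extrapolated)
{
    /* Never go backwards... */
    if (extrapolated < smoothed_time)
        return smoothed_time;
    /* ...and never jump by more than one block at once. */
    if (extrapolated > smoothed_time + BLOCK_SAMPLES)
        extrapolated = smoothed_time + BLOCK_SAMPLES;
    smoothed_time = extrapolated;
    return smoothed_time;
}
```

This trades a little accuracy during the 10 ms hiccups for visual smoothness, which seems like the right trade for a rhythm game.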
Anyway, sorry for the code spew; I've mostly posted it in case there's something obvious I'm screwing up. If this isn't typical behavior and nothing comes to mind off the top of your head, I'll build a lightweight test app that runs FMOD solo and see if I can reproduce it.
Thanks again for the help, quite enjoying FMOD so far even with all this trouble.