
In FMOD 4, setDSPBufferSize dictated how often the DSP clock updated, which meant I could schedule sounds with the accuracy of whatever buffer size I set.

In FMOD 5, I’m seeing that even though I call setDSPBufferSize(256, 4), every call to getDSPClock shows a difference of 1024.

So if I understand how things work, it means I cannot schedule a sound to play at a granularity finer than 1024 samples?

Is there a way to have the mixer update with the same granularity as the buffer size, like in FMOD 4?

Best Answer

What you are likely seeing is the OS operating at a coarser granularity than FMOD, which results in FMOD performing multiple mixes one after another to fill each OS buffer.

In FMOD 4 our policy was to be very hands-on in controlling the Audio Session settings, which made it difficult for developers to combine FMOD with other audio-producing software, such as video players.

With FMOD 5 we leave configuration of the Audio Session up to the user; this includes setting an appropriate Audio Session category, handling interruptions, and controlling latency.

The Audio Session property you are probably most interested in right now is kAudioSessionProperty_PreferredHardwareIOBufferDuration. Changing this value should give you the same behavior as FMOD 4.

PS. I’ll amend our docs with this information…


setDSPBufferSize() should operate the same in FMOD 4 and FMOD 5. Are you calling setDSPBufferSize() before System::init() and checking the return code? What does getDSPBufferSize() return after calling System::init()?

Scheduling of sound playback via Channel::setDelay() and volume ramping via Channel::setFadePoint() do not require clock values to be a multiple of the block size.

If you simply call System::playSound(), playback will take effect at the start of the next mix block, in which case larger block sizes introduce latency.


Yes, setDSPBufferSize is called before init, and getDSPBufferSize returns the value I set, so on that side of things everything looks fine.

Further investigation shows the simulator seems fine; it’s just on the device (iPhone 5S) that the buffer size appears to remain 1024 regardless of what I do.

I’m measuring by calling getDSPClock on the master channel group from a thread and, whenever the value changes, diffing it against the last read. On the simulator I’ll see 256 (for example), but I always see 1024 on the device.

Regarding setDelay / scheduling, I think I was confusing the minimum time I could schedule with the granularity. As I understand it now, I basically have to use (bufferSize * 2) + specificSampleToDelayBy. For some reason I had it in my head that I could only schedule on buffer boundaries, so ignore that part of my comment now :)

  • Nicholas Wilcox
    (bufferSize * 2) is our "rule of thumb" safety margin. The mixer thread is running at regular intervals in the background, and if the clock value passed to setDelay() is in the past then there is zero delay and you lose the ability to perform sample accurate scheduling.