
I have noticed that interval playback sounds slightly different on Mac, Linux, and Windows when using a DSP clock delay for sample-accurate interval boundaries.

I have written an example that plays 200ms of a beep file, ending at the millisecond provided at the command line.
For example, if you run "./main 1000" it will play back 800ms-1000ms.

On Mac, using the provided beep file, the first input that produces any sound is 901. On Linux, however, you don’t hear anything until 917. I have tested this behavior on 6 different machines.
On Windows I think the offset is even greater, but it’s simplest to keep the discussion to Mac and Linux for now.

My example can be found here: http://memory.psych.upenn.edu/files/sof … t_demo.zip
A Makefile is included. Please make sure LD_LIBRARY_PATH includes the current directory ".".

Thanks in advance for taking a look, and thank you for providing a fantastic free audio API.


It sounds to me like you are scheduling too close to the current time, i.e. not starting exactly when you want. As the DSP clock is constantly ticking in the background, you want to base your start/stop delays a bit in the future. The clock generally updates in chunks of 1024 samples (see System::getDSPBufferSize for the exact value).

A safe number of samples in the future would be 2 * dspBufferSize, so 2048 samples. To ensure your beep plays for the full 200ms, I recommend you set both a start and a stop delay, 200ms apart. Otherwise the sound will start on the next mix boundary.


Thanks for your help.
I implemented your suggestion (if I understood it correctly) and I still get the same behavior.
In this case I delay the sound by a full second:
[code:13si0mkx]
startDelayFrames = 44100; /* delay playback by one second's worth of frames */

/* read the current DSP clock */
FMOD_System_GetDSPClock(fmsystem, &hitime, &lotime);

/* schedule the start at now + startDelayFrames */
hiclock = hitime;
loclock = lotime;
FMOD_64BIT_ADD(hiclock, loclock, 0, startDelayFrames);
result = FMOD_Channel_SetDelay(channel, FMOD_DELAYTYPE_DSPCLOCK_START, hiclock, loclock);
if (result != FMOD_OK) {
    fprintf(stderr, "FMOD error: (%d) %s\n", result, FMOD_ErrorString(result));
    fprintf(stderr, "cannot set start delay\n");
    return -1;
}

/* schedule the end at now + interval length + startDelayFrames */
hiclock = hitime;
loclock = lotime;
FMOD_64BIT_ADD(hiclock, loclock, 0, (endFrame - startFrame) + startDelayFrames);
result = FMOD_Channel_SetDelay(channel, FMOD_DELAYTYPE_DSPCLOCK_END, hiclock, loclock);
if (result != FMOD_OK) {
    fprintf(stderr, "FMOD error: (%d) %s\n", result, FMOD_ErrorString(result));
    fprintf(stderr, "cannot set end delay\n");
    return -1;
}[/code:13si0mkx]

[quote="ymasory":xeaj07nv]I have written an example that plays 200ms of a beep file, ending at the millisecond provided at the command line.
For example, if you run "./main 1000" it will play back 800ms-1000ms.[/quote:xeaj07nv]
What exactly are you trying to achieve? Do you want to:
[list=a:xeaj07nv][*:xeaj07nv] Play a 200ms beep that ends at the specified time[/*:m:xeaj07nv]
[*:xeaj07nv] Play a variable-length beep that starts immediately[/*:m:xeaj07nv]
[*:xeaj07nv] Something else[/*:m:xeaj07nv][/list:o:xeaj07nv]
The reason I ask is that your description seems to suggest (a), but your example seems to be trying to implement (b).

Specifically, you’re setting the sound’s initialseekposition to startFrame. This doesn’t change the time at which the sound starts playing – the sound will still start immediately. initialseekposition just "fast forwards" the sound to the given position. Thus, when you run "./main 900", the sound starts playing at position (900 – 200) = 700 milliseconds, and stops playing at position = 900 milliseconds. Since your beep.wav is silent for the first 900 milliseconds, you don’t hear anything. As you increase the end time beyond 900, you will hear beeps of increasing length.

The difference between platforms is due to variation in the delay between calling PlaySound and the sound actually starting. This is expected behaviour, and can be made predictable by setting FMOD_DELAYTYPE_DSPCLOCK_START, as Mathew suggested.

If you can explain what you’re trying to achieve, we can suggest the best way to achieve it.


Many thanks for your help Ben.
I am trying to do A.
So the behavior you describe is exactly what I’m trying to achieve. If
someone runs "./main 900" I want 700-900ms to play back, as accurately
as possible. Accuracy of playback is more important than latency, so I
have no problem delaying playback to the future as Mathew suggested.

As you note, as I increase the end time beyond 900 I should hear beeps
of increasing length. Unfortunately beeps only begin at the expected
901 on Mac. On Linux beeps begin at 917. On Windows it seems a bit
more variable, but in the range of 975-1050.

I have implemented Mathew’s suggestion. I think you looked at my original example, posted before Mathew’s suggestion. Here is a link to a revised example, which does use FMOD_DELAYTYPE_DSPCLOCK_START to delay the start of playback by 1 second (I tried 2048 frames as well): http://memory.psych.upenn.edu/files/sof … _delay.zip
While the delay is successful, the end-chopping behavior on Linux and Windows is unaffected; it just moves the entire playback sequence one second forward.


I have looked into your problem and I see what is going on.

When using setDelay you need to deal in output samples, not input samples. What I mean is that while you are feeding FMOD a beep file recorded at 44100Hz, the FMOD system operates at 48000Hz (this can be retrieved via System::getSoftwareFormat). So when you do your end-frame calculation, you need to base it on the system rate, not the input rate.

Your initial seek position will still need to be in input samples though (so 44100Hz in your case).

The reason this works on Mac is most likely that the sound card you are using defaults to 44100Hz, so FMOD runs at that rate. If you have any questions about modifying your example to get it working, please let me know.


Thank you for your help Mathew, you are exactly right. Dropping the assumption that the input rate equals the output rate fixed the behavior on Linux, and also on one Mac that was later found to be running at 48kHz.
It did not solve the much larger end-chop on Windows, but I’ll post about that separately once I get the minimal example compiling with Visual C++.

Also, I couldn’t find anything in the documentation that gives the input’s sample rate; I had to pass it in from another part of the program. Does the API have a way of determining the input sample rate?

Thanks again!


You can get the input sample rate via Sound::getDefaults; the first parameter is the frequency.
