I have what should be a simple app: it does an Init, sets up a Buffer, and then simply loops, listening to whatever is on the audio input.
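
Roughly, the structure is something like this (a simplified sketch only: FMOD Core recording API assumed, 44.1kHz mono PCM16, all error checking omitted, so the real code differs in detail):

    // Sketch only: older FMOD Ex versions have slightly different signatures
    // for some of these calls.
    #include <fmod.hpp>

    int main()
    {
        const int SAMPLE_RATE = 44100;               // assumed capture rate

        FMOD::System *system = nullptr;
        FMOD::System_Create(&system);
        system->init(32, FMOD_INIT_NORMAL, nullptr); // the "Init"

        // The "Buffer": a looping user-created sound that FMOD records into.
        FMOD_CREATESOUNDEXINFO exinfo = {};
        exinfo.cbsize           = sizeof(exinfo);
        exinfo.numchannels      = 1;                 // mono
        exinfo.format           = FMOD_SOUND_FORMAT_PCM16;
        exinfo.defaultfrequency = SAMPLE_RATE;
        exinfo.length           = SAMPLE_RATE * sizeof(short);  // ~1 second ring buffer

        FMOD::Sound *sound = nullptr;
        system->createSound(nullptr, FMOD_OPENUSER | FMOD_LOOP_NORMAL, &exinfo, &sound);
        system->recordStart(0, sound, true);         // record device 0, looping

        // The "simply loops" part: poll the record cursor and analyse new samples.
        for (;;)
        {
            system->update();
            unsigned int recPos = 0;
            system->getRecordPosition(0, &recPos);   // cursor position, in PCM samples
            // ... lock()/unlock() the newly written region and run the analysis ...
        }
    }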

This does NOT need to be real-time (i.e. delays are not an issue), but it
DOES need to be frequency- and phase-precise in the analysis loops.

This all works, and gives nice, expected, precise results for tests at 100Hz, 200Hz, 300Hz and all the 100Hz harmonics up to 900Hz – the storage-scope emulation I have shows very stable, sync’d waveforms for minutes at a time.

However, if I move to 199Hz or 201Hz, the very clean plot starts to fuzz up (though it is still broadly locked), and while 700Hz is solid, 350Hz has 180° phase jumps (and the calculated frequency is likewise off).

As this is all time-domain SW, I am stumped as to how it even ‘knows’ it is reading 100Hz, etc. The ones that work do not even correspond to an integer number of samples per cycle of the sample frequency (e.g. 600Hz is 73.5 spc and 400Hz is 110.25 spc?).

It is also puzzling why 700Hz is fine but 350Hz is not (and 300Hz is OK, but 150Hz is not), etc.
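
(For reference, the two figures quoted above both point to a 44.1kHz capture rate, since 600 × 73.5 = 400 × 110.25 = 44100; a trivial tabulation of the test frequencies at that assumed rate:)

    // Samples-per-cycle for the test frequencies, assuming the 44.1kHz rate
    // implied by the 73.5 spc / 110.25 spc figures above.
    #include <cstdio>

    int main()
    {
        const double fs = 44100.0;
        const double freqs[] = { 100, 150, 199, 200, 201, 300, 350, 400, 600, 700, 900 };
        for (double f : freqs)
            std::printf("%5.0f Hz -> %8.3f samples per cycle\n", f, fs / f);
        return 0;
    }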

I can vary the SW sleep times, and every BUFFERSIZE I can find, and nothing changes this effect. Stereo/mono also seems not to matter.

Setting the EXE Priority in Control Panel does help reduce the phase variations, but it does not eliminate them.
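
(For completeness, the same bump can also be requested from inside the app rather than set by hand each run, using the standard Win32 priority calls; the exact levels below are just an example:)

    #include <windows.h>

    // Raise the process class and the capture/analysis thread priority from code,
    // instead of setting the EXE priority manually after each launch.
    void raisePriority()
    {
        SetPriorityClass(GetCurrentProcess(), HIGH_PRIORITY_CLASS);
        SetThreadPriority(GetCurrentThread(), THREAD_PRIORITY_HIGHEST);
    }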

So it really does not look like an upper-layer SW issue, but more like some low-level issue in FMOD/Windows.

Does anyone have any ideas, or has anyone solved this problem?



A slight update:
I did find that putting the read inside a loop, and waiting until maybe Buffer/9 (or any reasonable chunk) of bytes is ready, DID make quite a difference.

This is in spite of the net loop-spin times being paced to be similar.

  • i.e. it seems WHERE in the loop you wait matters a lot (see the sketch below).
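
Roughly, the pattern that made the difference looks like this (simplified sketch again: mono PCM16 assumed, error checking omitted, and Buffer/9 just being the chunk size that happened to work):

    #include <fmod.hpp>
    #include <windows.h>

    // How many new mono PCM16 samples have been recorded since 'lastPos'
    // (handles the ring-buffer wrap-around).
    unsigned int samplesAvailable(FMOD::System *sys, unsigned int lastPos,
                                  unsigned int bufferSamples)
    {
        unsigned int recPos = 0;
        sys->getRecordPosition(0, &recPos);          // record cursor, in PCM samples
        return (recPos >= lastPos) ? recPos - lastPos
                                   : bufferSamples - lastPos + recPos;
    }

    // Block INSIDE the loop until a whole chunk is ready, then read exactly that
    // much, rather than sleeping a fixed time and taking whatever happens to be there.
    void readOneChunk(FMOD::System *sys, FMOD::Sound *snd,
                      unsigned int &lastPos, unsigned int bufferSamples)
    {
        const unsigned int chunk = bufferSamples / 9;    // the "Buffer/9" chunk

        while (samplesAvailable(sys, lastPos, bufferSamples) < chunk)
        {
            sys->update();
            Sleep(1);                                    // yield while waiting
        }

        void *p1, *p2;
        unsigned int n1, n2;
        snd->lock(lastPos * sizeof(short), chunk * sizeof(short), &p1, &p2, &n1, &n2);
        // ... copy p1 (and p2, if the region wrapped) into the analysis buffer ...
        snd->unlock(p1, p2, n1, n2);

        lastPos = (lastPos + chunk) % bufferSamples;
    }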

It is now MUCH less frequency-paranoid, though Windows HIGH priority still helps too, so it seems this low-level Windows/HW interaction is something of a lottery?
