I was wondering if you more-experienced FMODers might be able to give me a quick impression about minimum system requirements for a project I'm finishing.
It plays nine stereo .ogg streams encoded at 128 kbps, all at once (a lot, eh? What can I say, I'm a music/composer guy :smile:).
I wrote it on a PIII 933 MHz w/256 MB. Works great there. It stumbles unbearably on a PII 200 MHz w/32 MB. I'll keep testing on more machines (and OSes), but I was hoping someone might be able to give me a quick impression from their experience so I know what to expect.
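For anyone trying to gauge the load, here's a rough back-of-envelope sketch of how much compressed data the machine has to decode per second with this setup (the stream count and average bitrate are taken from the post above; the figures are illustrative, not a real CPU benchmark):

```python
# Back-of-envelope: total Vorbis data decoded per second
# with nine stereo streams at an average of 128 kbps each.
STREAMS = 9
AVG_KBPS = 128  # average VBR bitrate per stream

total_kbps = STREAMS * AVG_KBPS
print(total_kbps)  # 1152 kbps of compressed audio to decode every second
```

The decode work scales roughly linearly with the number of simultaneous streams, which is why the jump from one test track to nine hits an older CPU so hard.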
Thanks in advance!
In case anyone else is interested, I was able to get information on the RC3 .ogg VBRs I used from the crew at Goldwave.
I had encoded the audio files using Goldwave, an audio editor. You could choose from many different bitrates for .ogg VBR files, and after each bitrate was an indication like (.4q).
It turns out that the VBR bitrates indicated were average rates, and the 'q' number (always between 0 and 1) was an estimate of the quality achieved at that average bitrate. This quality factor is, I assume, something Goldwave came up with and assessed.
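Since a VBR file only has an *average* rate, you can always check what a given encode actually came out to from its file size and duration. A minimal sketch (the example figures are made up for illustration):

```python
def avg_bitrate_kbps(file_size_bytes, duration_seconds):
    """Average bitrate of a VBR file: total bits / total time, in kbps."""
    return file_size_bytes * 8 / duration_seconds / 1000

# e.g. a hypothetical 3-minute track that came out at 2,880,000 bytes:
print(round(avg_bitrate_kbps(2_880_000, 180)))  # 128
```

Handy for confirming that the rate Goldwave reported matches what the encoder actually produced.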
That’s the story.
However, newer processors will surely perform better than older processors at the same clock speed.
I mean: a PII at 233 MHz will take less time to decode an ogg than a Pentium MMX at 233 MHz. This happens because of the instruction optimizations in the newer processors.
The only processor this doesn't apply to is the Pentium IV.
Thanks Brett & Blackshard for your continued thoughts on this topic.
Brett: Excuse my ignorance, but when I encoded my .ogg files I could choose from 500 kbps VBR down to 64 kbps VBR (all RC3). I chose 128 assuming that it would be more CPU-efficient (and I found the sound still acceptable), even though all rates were labeled variable. Is this wrong? Doesn't the kbps rate factor into CPU demands?