In our custom build of UE4 we patched the XAudio2 module so that we can apply a per-source lowpass filter to the direct sound path, while leaving the reverb sound path unfiltered. This realizes a poor man's obstruction filter. We are now evaluating the use of FMOD, but since it basically replaces the XAudio2 module, our filtering capabilities would be lost. Could you please provide some hints about how to re-implement a similar filter?
Our next patch release will have reverb and ambient zone support.
The code for the ambient zone applies a low-pass effect per instance based on the listener and emitter positions. That sounds very similar to what you want. I would get the next release (in 1-2 weeks' time) and look at modifying the code in FMODAudioComponent to set the low-pass value based on your desired logic (e.g. a ray-cast to find obstructions).
In the above system you won't need to add an Occlusion parameter per instance, but you do need to add a low-pass effect to the event's master track. The UE4 integration will then find it and drive the cutoff parameter.
After you get the release I would be interested in what you think of it. We might want to either add a general way of driving lowpass per instance for these ambient sounds (e.g. adding a blueprint function to do it), or else incorporate ray-cast occlusion code straight into the audio component directly if it's generally useful.
I've been thinking about your dynamic occlusion filter for a few days now. It sounds nice and realistic; I would like something like this in our game. I also wonder whether it would be possible, and whether you plan, to combine this with the "ambient zone system", or to keep it as a separate feature to control what is heard behind walls/meshes?
It may help to explain a related feature we are adding soon.
One of the next high priority items on our list is to support the inbuilt UE4 concept of ambient zones. In the UE4 sound system, this has the ability to apply a lowpass DSP and gain modification on certain sounds (those with a sound class that sets bApplyAmbientVolumes).
To support a similar concept, sound designers can set up events by doing the following:
- Add a low pass DSP to their event(s) in Studio
- Add an “Ambient” user property to their event(s) in Studio
The integration will then look at the user properties, and if the Ambient property is set, it will apply a gain change and drive the lowpass DSP manually, depending on the reverb zone the listener is in. So the plan is that the integration code won’t create the DSP effect but it will find it and drive the cutoff if it exists.
If this is something like what you need to do, you can wait until we add that feature and see the approach we took.
However, I'm not sure exactly why you would want to put a lowpass on every instance across the board.
You might be able to do everything within the Studio tool itself by adding lowpass to certain buses and triggering snapshots to drive them. That would allow designers to set up the effects however they like.
- I defined an Occlusion parameter and I am now able to change its value at runtime. Thus, the designer can use it in Studio to automate the proper effect. However, the effect has to be applied to every single event. Is there a way to simplify things? Consider that each event will have a different value of the occlusion parameter, so routing the effect through a single bus is not a viable option.
The feature you are describing sounds interesting, but I am not sure it would help me. What I am trying to obtain is a dynamic obstruction filter. Consider the following gameplay situation: we have a room with a large column in the middle. We thus have a single reverb zone, but the sound source is heard differently according to its position relative to the listener and the column. If the direct path between listener and source is unobstructed, the dry sound and the reverb sound are unaffected and no filter is applied. However, if the direct path is obstructed by the column, the dry sound is muffled, while the reverb is still unaffected: in this situation, I apply a lowpass filter on the dry sound only, while the reverb stays unfiltered. Since both the listener and the sound source may move, we have to change the filter parameters dynamically at runtime. I have all the machinery working with XAudio2. I read the low level API of FMOD and I believe it has all the functions I would need to get the same effect, but I would rather not go to the lowest level if there's a better way.
An even more complex filter we have implemented takes into account the perceived position in the presence of obstacles: for example, if the sound source is in a different room, the sound will be perceived as coming from the door connecting the two rooms, rather than from the actual 3D position of the sound source. We achieve this effect by changing the sound position before applying spatialization (this is done as part of the Unreal sound processing, without actually moving the SoundComponent that produced the sound). It would be nice to have this too.