Phasing Issues with Automatic Weapon Sound

Hey everyone,

I have been trying to build a weapon audio system for automatic weapons, but I’m running into the following issue:

If I fire the automatic weapon (a looping sound with a trigger cue) and another player with the same weapon is very close, the weapon sound sometimes starts phasing as soon as we both start firing. I have already tried randomizing the pitch, putting distance-based filters on the signal, and using different samples, but the phasing still occurs. It will probably become even more obvious once 4+ players are using the same weapon sound.

My question: Is there a way to prioritize closer sounds (maybe via ducking), or is there another solution for this problem? How do other multiplayer shooters resolve this issue?

Any help would be highly appreciated!!!

Best,
Ulrich

Studio does allow you to limit the number of instances of an event that can exist simultaneously in your project by setting the event’s Max Instances property in the event macro controls. If you select the ‘Virtualize’ or ‘Quietest’ stealing mode, only the loudest instances of the event will be audible. That said, this completely silences the instances in excess of the max, so it’s probably inappropriate for your case.
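For completeness, the authored Max Instances count and stealing mode apply automatically to any instances created through the Studio API, so nothing extra is needed on the game-code side. A minimal sketch, assuming an example event path ("event:/Weapons/RifleLoop" is a placeholder):

```cpp
// Minimal sketch (C++ Studio API). The event path is a placeholder; the Max Instances
// count and stealing mode are whatever you author in the event macro controls.
#include <fmod_studio.hpp>

FMOD::Studio::EventInstance* startWeaponLoop(FMOD::Studio::System* system)
{
    FMOD::Studio::EventDescription* desc = nullptr;
    system->getEvent("event:/Weapons/RifleLoop", &desc);

    FMOD::Studio::EventInstance* instance = nullptr;
    desc->createInstance(&instance);   // instances beyond Max Instances are stolen per the authored mode
    instance->start();
    return instance;                   // keep the handle so the looping sound can be stopped later
}

void stopWeaponLoop(FMOD::Studio::EventInstance* instance)
{
    instance->stop(FMOD_STUDIO_STOP_ALLOWFADEOUT);
    instance->release();               // instance is destroyed once it finishes stopping
}
```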


Thanks, but as you already mentioned, it doesn’t really solve the issue. I was also wondering whether a sidechain could help, but that would mean it would be sidechaining its own channel…

There are a number of methods for preventing phasing. Randomising the pitch and using multiple different audio files are usually the easiest solutions, but since you’ve already tried those…

You could try applying a random amount of start offset to your sound modules. To randomise the start offset of a sound module, select the sound module in the event editor, expand its ‘Trigger Behaviour’ drawer in the deck, then hold down the ‘Alt’ key while dragging the ‘Start Offset’ knob. (You can also right-click on the knob and select ‘Add Modulation > Random’ from the context menu.) This prevents phasing caused by multiple events starting at the same time.

You could also try applying extremely steep distance-based attenuation to event volume. To create custom attenuation curves, set the Distance Attenuation of the event’s 3D Panner effect module to ‘Off,’ then add a built-in Distance parameter to the event and use that parameter to automate master track volume. While this won’t actually prevent phasing, it can reduce the effect of phasing when multiple instances of an event are playing at different distances from the listener.
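On the game-code side, nothing special is needed for this beyond keeping each instance’s 3D attributes up to date, since the built-in Distance parameter is driven automatically from the instance’s position relative to the listener. A rough sketch with placeholder vector values:

```cpp
// Rough sketch: the built-in Distance parameter follows the instance's 3D attributes
// relative to the listener, so the game only needs to keep those attributes current.
// Position/velocity values here are placeholders supplied by your game.
#include <fmod_studio.hpp>

void updateWeapon3D(FMOD::Studio::EventInstance* instance,
                    const FMOD_VECTOR& position, const FMOD_VECTOR& velocity)
{
    FMOD_3D_ATTRIBUTES attributes = {};
    attributes.position = position;
    attributes.velocity = velocity;
    attributes.forward  = { 0.0f, 0.0f, 1.0f };   // unit vectors; orient to the weapon if needed
    attributes.up       = { 0.0f, 1.0f, 0.0f };

    instance->set3DAttributes(&attributes);       // the Distance parameter updates from this
}
```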


Hey Joseph,
Thanks for the reply, and sorry for the late response. I tried all of your suggestions (which are really good approaches, by the way), but the main issue is that if I use different random pitch values, the layers go out of sync.

But: I think I solved it by separating the events into 1P and 3P versions. That way I can create snapshots (1P ducks 3P, etc.). In combination with your suggestions, it pretty much resolved the issue!
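In case it helps anyone else, this is roughly how I drive the snapshot from game code. The snapshot path is just an example, and the actual amount of ducking applied to the 3P weapon bus is authored in the snapshot itself:

```cpp
// Start the ducking snapshot while the local (1P) player is firing; snapshots are
// triggered like events via a "snapshot:/" path. "snapshot:/Duck3PWeapons" is a placeholder.
#include <fmod_studio.hpp>

FMOD::Studio::EventInstance* g_duckSnapshot = nullptr;

void onLocalPlayerStartFiring(FMOD::Studio::System* system)
{
    FMOD::Studio::EventDescription* desc = nullptr;
    system->getEvent("snapshot:/Duck3PWeapons", &desc);
    desc->createInstance(&g_duckSnapshot);
    g_duckSnapshot->start();                               // 3P weapon bus is ducked while this plays
}

void onLocalPlayerStopFiring()
{
    if (g_duckSnapshot)
    {
        g_duckSnapshot->stop(FMOD_STUDIO_STOP_ALLOWFADEOUT);
        g_duckSnapshot->release();
        g_duckSnapshot = nullptr;
    }
}
```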

Ah, you need the tracks of each event instance to remain in sync? In that case, you should add a random modulator to the event’s pitch property rather than to the pitch properties of its sound modules. You can find event pitch in the event macro controls.
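If you would rather randomise it from game code instead, EventInstance::setPitch applies a single pitch scale to the whole instance, so all of its layers shift together and stay in sync. A rough sketch (the pitch range is just an example):

```cpp
// Apply a small random pitch scale to the whole event instance; every track is scaled
// by the same amount, so the layers remain in sync. The 0.97-1.03 range is an example
// (roughly +/- half a semitone).
#include <fmod_studio.hpp>
#include <random>

void applyRandomEventPitch(FMOD::Studio::EventInstance* instance)
{
    static std::mt19937 rng{ std::random_device{}() };
    std::uniform_real_distribution<float> dist(0.97f, 1.03f);

    instance->setPitch(dist(rng));   // uniform pitch scale across the entire instance
}
```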
