Question about Live Update functionality

Hi everyone, I have a question about how FMOD's Live Update works, based on what I've observed.

With Live Update connected and working, I would hop over to FMOD Studio and try to see which parts of the software were tracking my game at runtime.

To be honest, the only place I found that does this is the Mixer window you can call up in Studio, and even there the functionality is limited. As snapshots were triggered by the game state, I could audibly hear ducking come into effect, for example, or see the incoming signal in each channel meter drop to accommodate the snapshot. The channel fader itself did not move the way an automated fader would in a DAW such as Pro Tools, for example…
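For context, the ducking I'm hearing is driven entirely from the game side. As I understand it, snapshots are addressed through the Studio API just like events, using a "snapshot:/" path, so the game code doing the triggering looks roughly like this sketch (the snapshot name and function names are placeholders, and error checking is omitted):

```cpp
// Rough sketch of game-side snapshot triggering (placeholder names).
// Assumes an FMOD::Studio::System* that has already been created,
// initialized, and had its banks (including the strings bank) loaded.
#include "fmod_studio.hpp"

// Kept so the snapshot can be stopped when the game state that
// triggered it ends.
static FMOD::Studio::EventInstance* gDuckInstance = nullptr;

void startDucking(FMOD::Studio::System* studioSystem)
{
    // Snapshots are fetched like events, using a "snapshot:/" path.
    FMOD::Studio::EventDescription* description = nullptr;
    studioSystem->getEvent("snapshot:/Duck", &description);

    description->createInstance(&gDuckInstance);

    // Starting the instance applies the snapshot's mixer overrides,
    // which is what produces the ducking heard in the Mixer window.
    gDuckInstance->start();
}

void stopDucking()
{
    gDuckInstance->stop(FMOD_STUDIO_STOP_ALLOWFADEOUT);
    gDuckInstance->release();
    gDuckInstance = nullptr;
}
```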

I was also trying to observe my live, real-time (or I suppose run-time) 3D panning. The 3D sounds in my game were indeed 3D, and I even wrote specific ‘surround extent’ automation into the deck area of the audio track in my FMOD event. But none of this, nor the loop regions in the event editor or the playhead itself, is active or reading back like live automation in FMOD Studio while my game is running and Live Update is connected.
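To be clear about what I mean by run-time 3D panning: my understanding is that the game feeds listener and emitter positions to FMOD every frame, and Studio computes the panning (including surround extent) from those. Roughly, as a sketch with placeholder values and error checking omitted:

```cpp
// Sketch of how a game feeds 3D positions to FMOD each frame
// (placeholder values; error checking omitted).
#include "fmod_studio.hpp"

void updateSpatialisation(FMOD::Studio::System* studioSystem,
                          FMOD::Studio::EventInstance* instance)
{
    // Listener attributes, typically taken from the camera or
    // player character.
    FMOD_3D_ATTRIBUTES listener = {};
    listener.position = { 0.0f, 0.0f, 0.0f };
    listener.forward  = { 0.0f, 0.0f, 1.0f };
    listener.up       = { 0.0f, 1.0f, 0.0f };
    studioSystem->setListenerAttributes(0, &listener);

    // Emitter attributes for one 3D event instance.
    FMOD_3D_ATTRIBUTES emitter = {};
    emitter.position = { 5.0f, 0.0f, 2.0f };
    emitter.forward  = { 0.0f, 0.0f, 1.0f };
    emitter.up       = { 0.0f, 1.0f, 0.0f };
    instance->set3DAttributes(&emitter);

    // The spatializer (including surround extent) is evaluated from
    // these attributes when the system updates.
    studioSystem->update();
}
```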

So I suppose my question is… these would be nice features to have in FMOD. Is there something I'm doing wrong, and should I be seeing these things happen in real time in Studio? Or is Live Update purposefully configured not to do these things, in order to minimize the demand on system resources?

I also had a look at the FMOD documentation, and there is no mention of live behaviour like the kind I outlined, aside from the mixer.

Are you familiar with the Profiler window? In it, you can record all API calls made during a live update session and inspect the behavior of your project’s content in great detail. You can even inspect the parameter values of each individual event instance and see how they change over time. It may be what you need.
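In case it helps, both Live Update and the profiler connection require that your game initializes the Studio system with the live update flag. A minimal sketch, with placeholder channel count and bank names:

```cpp
// Minimal sketch of initializing the Studio system so that
// Live Update (and therefore the profiler) can connect.
// Channel count and bank names are placeholders.
#include "fmod_studio.hpp"

FMOD::Studio::System* createStudioSystem()
{
    FMOD::Studio::System* studioSystem = nullptr;
    FMOD::Studio::System::create(&studioSystem);

    // FMOD_STUDIO_INIT_LIVEUPDATE opens the network listener that
    // FMOD Studio connects to for Live Update and profiling.
    studioSystem->initialize(512,
                             FMOD_STUDIO_INIT_LIVEUPDATE,
                             FMOD_INIT_NORMAL,
                             nullptr);

    // The strings bank is needed for path-based lookups such as
    // getEvent("event:/...").
    FMOD::Studio::Bank* masterBank = nullptr;
    FMOD::Studio::Bank* stringsBank = nullptr;
    studioSystem->loadBankFile("Master.bank",
                               FMOD_STUDIO_LOAD_BANK_NORMAL, &masterBank);
    studioSystem->loadBankFile("Master.strings.bank",
                               FMOD_STUDIO_LOAD_BANK_NORMAL, &stringsBank);
    return studioSystem;
}
```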

In the most recent version of FMOD Studio, the profiler is described starting on page 54 of the FMOD Studio Getting Started Guide and on page 213 of the FMOD Studio User Manual.

As for displaying the behavior of your game’s events within the event editor, there’s a good reason why we don’t do that: the event editor can only display a single event instance at a time, and it’s a special instance that exists for auditioning purposes. Your game project, by comparison, could easily be playing multiple instances of any given event at any moment. There is no reasonable way for FMOD Studio to predict which event instance you wish it to display - and even if there were, it is more important for usability that the auditioning instance remains displayed.
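To illustrate, nothing stops your game code from creating several instances of the same event description and starting them all at once. A rough sketch, with a placeholder event path and error checking omitted:

```cpp
// Illustration of why there is no single "the" instance to display:
// one event description can have many live instances at once.
// Placeholder event path; error checking omitted.
#include "fmod_studio.hpp"
#include <vector>

std::vector<FMOD::Studio::EventInstance*>
spawnFootsteps(FMOD::Studio::System* studioSystem, int count)
{
    FMOD::Studio::EventDescription* description = nullptr;
    studioSystem->getEvent("event:/Character/Footstep", &description);

    std::vector<FMOD::Studio::EventInstance*> instances;
    for (int i = 0; i < count; ++i)
    {
        FMOD::Studio::EventInstance* instance = nullptr;
        description->createInstance(&instance);
        instance->start();
        instances.push_back(instance);
    }
    // All of these instances play simultaneously; the event editor's
    // timeline, by contrast, reflects only the auditioning instance.
    return instances;
}
```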


Thank you for your response, Joseph.

I already know about the profiler functionality. Allow me to just confirm that we understand each other:

When I ask why FMOD doesn’t display all of your events live, what I mean by this is… say I have my Unreal game connected and running, for example, and I put my player character in the centre of the action in the level.

There could be numerous audio events being called and triggered all at once, with a complex, dynamic mix happening. Why is it that when I place my player character in the centre of all of this, then switch my screen over to the live-connected FMOD Studio window to have a closer look at the events happening, I don’t see any of the dynamic changes I made in Studio playing back with the game in real time?

This is why I wondered if it was a performance issue, and whether these particular items are not tracked live in order to keep system resource use down.

Sure, when my player character is in a location where only one event is playing, it makes sense for the event editor to display just that one item playing back in detail, so on that point I think we are on the same page…

But I don’t even see things like playheads moving in the event editor, for instance. No live tracking of the automation I’ve written, etc.

I also understand that the Profiler perhaps provides the solution to this scenario as well, giving you a more detailed recording over time of how your game was controlling the audio via the API connection.

There could be any number of instances of a given event playing in your game project, but the event editor window can only display a single event instance at a time, and has no way of knowing which one you would want to display. In addition, it needs to display the auditioning instance. Therefore, rather than forcing you to guess which instance is displayed, it consistently displays the auditioning instance.
