FMOD Studio Features: Profiler Introduction

Checked with: Version 1.07.00  03/02/2016

We walk through features of the Profiler in FMOD Studio 1.06 that allow you to connect to, record and analyse the audio in your game.

 

Download FMOD Studio and the FMOD project used in this tutorial from the Downloads Page.

 
 

Transcript

 

(0:00) Hi, my name is Sally Kellaway, and welcome to a tutorial that will bring you up to date on one of the more recent features of FMOD Studio. Over the last few months we have made several updates to the Profiler, which we will cover here. The video covers the major update from 1.06, and we will also look at the extra functionality added through 1.07 and up to 1.08.

(0:20) This tutorial will take you through the Profiler layout and explore its functionality as well. The follow-up tutorial in this mini-series will cover the API capture feature in the Profiler. API capture (or Instant Replay) allows you to compare the recorded Sessions from your game with the current state of your Events in FMOD Studio. We will cover this in more detail in the next video.

(0:41) So we’ll start off by making sure you know where to find the Profiler and what it does. The Profiler is a tool that records the output of your game or Studio project in Sessions, and shows data for a large number of metrics based on the recorded Session. This tool essentially allows you to assess performance, bug check and iterate on the mix of your project, both within the Studio environment and in-game.
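
For the Profiler to connect to a running game (rather than to Studio itself), the game's FMOD Studio system needs to be initialized with live update enabled. If you are integrating the API directly, a minimal sketch of that setup in the FMOD Studio C++ API looks like this (the channel count is just an example value):

#include "fmod_studio.hpp"

// Sketch: initialize FMOD Studio with live update so the Profiler can connect.
FMOD::Studio::System* studioSystem = nullptr;
FMOD::Studio::System::create(&studioSystem);

// FMOD_STUDIO_INIT_LIVEUPDATE opens the live update connection used by the
// Profiler and Mixer; 512 is an arbitrary example channel count.
studioSystem->initialize(512, FMOD_STUDIO_INIT_LIVEUPDATE, FMOD_INIT_NORMAL, nullptr);

// Call update() once per game frame so Events advance and profiling data flows.
studioSystem->update();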

(0:56) So the Profiler is super cool because it lets you keep multiple saved Sessions, and you can view a range of data both live and after the fact. Once you have finished recording a Session, you have access to volume, voices, CPU, memory, lifespans and instances, both on the fly and recorded (saved), so you can analyse them at a later date.

(1:01) This means that you can both view and listen to individual captures of game playthroughs, to compare how your mix performs across many different play styles. The Profiler includes an API capture function, which you can use to edit and update the audio playing through your recorded Sessions, so you can make changes to your Sound Events and hear the difference between the recorded and “updated” Events, including mix and design changes in the Mixer and Event Editor windows.

(1:44) I will show you a bit more about that in the follow-up tutorial, but today we are going to jump straight in and explore the Profiler layout. So this is the Profiler window, which can be accessed with the Ctrl + 6 (or Cmd + 6 on Mac) keyboard shortcut.

The Standard Profiler Layout when you open the Profiler.

(1:57) In this left-hand panel, you can create and see any recorded Sessions that you have. Sessions can be created by right-clicking in this panel, or by going straight ahead and hitting Record.

Creating a new Session can be done by right-clicking in the Sessions tab.

(2:05) Now I am going to call this ‘example’ because it is an example. As you can see, I already have a project open; this is just the project from the Unreal Engine 4 video tutorial series. In 1.07, the Profiler was updated so that when you record a new Session, it automatically populates with the Events that are triggered.

In 1.07+, when you record a Profiler Session, the active Events will automatically populate the Session.
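
In a game, those triggered Events come from code (or from your engine integration). As a rough sketch of what the Profiler is picking up, this is how an Event is typically loaded and started through the C++ API, reusing studioSystem from the earlier snippet; the bank and event paths here are placeholders rather than names from the tutorial project:

// Sketch: load banks and start an Event; anything started this way shows up
// in the recorded Session. Paths below are placeholder examples.
FMOD::Studio::Bank* masterBank = nullptr;
studioSystem->loadBankFile("Master Bank.bank", FMOD_STUDIO_LOAD_BANK_NORMAL, &masterBank);
FMOD::Studio::Bank* stringsBank = nullptr;
studioSystem->loadBankFile("Master Bank.strings.bank", FMOD_STUDIO_LOAD_BANK_NORMAL, &stringsBank);

FMOD::Studio::EventDescription* description = nullptr;
studioSystem->getEvent("event:/Ambience/Forest", &description);

FMOD::Studio::EventInstance* instance = nullptr;
description->createInstance(&instance);
instance->start();
instance->release();   // release the handle; the Event keeps playing until it stops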

(2:33) If you are working in 1.06, a new Profiler Session will only load up the Master Bus and any other Buses that you have within the project. So if you want to view the data for each Event separately, you will need to drag and drop each Event into the Profiler Session from either the Event Editor or the Event Browser.

To manually Scope in Events (1.06 and earlier), drag them in from the Event Browser.

 

(2:54) So now that we are ready to record, you can populate the view with additional data graphs if you want to see any specific data upfront. This will be necessary if you are working in 1.06 or earlier.

Data Graphs can be pre-loaded to view before recording.

(3:04) Alright, so now we will start recording with the Profiler, and we will investigate what each of the data graphs show.

 

(3:20) So I am going to play the environment ambience sound.

(3:26) Then I am going to hit record in the Profiler, and you will see it start recording and showing us some data.

(3:33) So, these radio buttons up the top will let you select which graph is shown for the selected Track.

Switching between the Data Graphs is done with the radio buttons.

(3:38) At the moment we have the Instances graph shown, but I’ll talk you through everything, starting with “Levels”.

(3:43) So the “Levels” option gives you the RMS value (around negative 35 dB in this example) and the peak value at the same time in the data lane. To see the exact value, just mouse over the graph and it will show you the value at that point on the Timeline (and that is the same for any of the data graphs).

The Levels graph shows the RMS and Peak of the Event Tracks.
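
If you also want RMS and peak readings from the game side at runtime, outside the Profiler, the low-level metering API can be enabled on a bus's channel group. This is only a sketch under the assumption that audio is already playing (so the master bus channel group exists), and it is not something the tutorial itself covers:

// Sketch: read RMS/peak programmatically via DSP metering on the master bus.
FMOD::Studio::Bus* masterBus = nullptr;
studioSystem->getBus("bus:/", &masterBus);

FMOD::ChannelGroup* masterGroup = nullptr;
masterBus->getChannelGroup(&masterGroup);          // valid once audio is routing through the bus

FMOD::DSP* headDsp = nullptr;
masterGroup->getDSP(FMOD_CHANNELCONTROL_DSP_HEAD, &headDsp);
headDsp->setMeteringEnabled(false, true);          // meter the DSP's output

FMOD_DSP_METERING_INFO meter = {};
headDsp->getMeteringInfo(nullptr, &meter);         // rmslevel[] and peaklevel[] per channel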

(4:01) Memory is the next option, which shows the global memory use on the Master Bus. This Event has very low memory requirements, so you can see it is using practically no memory.

The Memory Graph shows Memory usage in KBs.
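
For a quick game-side check of the same kind of information, the core API exposes overall memory statistics. A minimal sketch (this reports total FMOD memory, not a per-bus figure):

#include "fmod.hpp"

// Sketch: query overall FMOD memory usage at runtime.
int currentAllocated = 0;   // bytes currently allocated by FMOD
int maxAllocated = 0;       // high-water mark in bytes
FMOD::Memory_GetStats(&currentAllocated, &maxAllocated, false);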

(4:05) We will have a look at the CPU graph now. The CPU graph shows the milliseconds of CPU time taken to process the event when it is actually being played.

The CPU Graph shows the CPU time taken to process the event.
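
There is also a programmatic counterpart on the game side if you want CPU figures outside the Profiler. A sketch using the Studio API as it existed in the 1.x versions (the struct and its field names changed in later major versions):

// Sketch: query overall CPU usage through the 1.x Studio API.
FMOD_STUDIO_CPU_USAGE cpuUsage = {};
studioSystem->getCPUUsage(&cpuUsage);
// cpuUsage.dspusage    - time spent in the DSP mixer
// cpuUsage.studiousage - time spent in the Studio update thread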

(4:13) The Voices chart tells you how many voices are active at any one time on that Event track. You can think of a voice as a module from that Event that is being played. Within this Event we have many modules, and because there are scatterer sounds as well, we have many voices active at the same time. This graph shows both the “Real” voices (solid graph) and the number of “Real + Virtual” voices (transparent graph).

The Voices graph gives you access to Total and Self-triggered voices.
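
If you want to see the real/virtual distinction from the game side, the channels behind an event instance can be inspected through its channel group. A rough sketch, assuming the instance from the earlier snippet has been started and the mixer has created its channel group (note that nested events live in child groups, which this simple loop does not descend into):

// Sketch: count real vs. virtual voices directly under one event instance.
FMOD::ChannelGroup* eventGroup = nullptr;
instance->getChannelGroup(&eventGroup);   // valid once the instance is playing

int numChannels = 0;
eventGroup->getNumChannels(&numChannels);

int realVoices = 0;
int virtualVoices = 0;
for (int i = 0; i < numChannels; ++i)
{
    FMOD::Channel* channel = nullptr;
    eventGroup->getChannel(i, &channel);

    bool isVirtual = false;
    channel->isVirtual(&isVirtual);
    if (isVirtual) { ++virtualVoices; } else { ++realVoices; }
}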

(4:30) The Lifespans chart will show you the current lifespan of that Event. In the video, we pressed play on one particular sound Event (playing it through Studio), so the lifespan for that Event is just that one line, and it continues on for as long as the Event is playing. When there is more than one Instance of the Event playing, you will see more lines in this chart.

The Lifespans graph shows you the total duration of the Event.

Now if you hit the Instances radio button in the graph selection panel, you can switch to see how many Instances are active for that Event. This graph shows both the “Real” Instances (solid graph) and the number of “Real + Virtual” Instances (transparent graph).

(4:47) For this Event at this point in the video, there is only one instance, because we only pressed play on one Event in one Event Editor window. When we connect the Profiler to a whole game, you’ll see a count of all Instances of that Event in its data graph.

The Instances graph gives you access to the Total and Self triggered instances of the sound.
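
The same count is available to the game at runtime through the event description, which can be handy for spotting instance leaks alongside the Profiler. A minimal sketch, reusing the description object from the earlier snippet:

// Sketch: ask how many instances of this Event currently exist in the game.
int instanceCount = 0;
description->getInstanceCount(&instanceCount);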

 

(4:57) On the right hand side here we have this 3D view panel. The 3D view shows you all Instances that are in various states of activity during your Profiler session. Events that are Playing are shown as filled circles, while virtual events are hollow circles. Events that have been Stopped become dimmer and fade over time. Clicking on an event in this view highlights its associated track in the main Profiler window.

(5:01) Now, in this example from the video, because this Event has no 3D panning happening at any one time (it has a static location), we don’t see any particular movement come through in the 3D view.

The 3D Panner plays back the positional and Play state data for each Instance.

(5:10) Now, if any of our sounds have 3D position automation or 3D panning information, that will be recorded in the Profiler. So I will just move it left to right. The automation on this Event will also be visible in the Profiler.

The 3D Panner plays back the positional and Play state data for each Instance.
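
In-game, the positions you see moving around the 3D view come from the 3D attributes the game sets on each event instance, typically every frame. A rough sketch, reusing the instance from the earlier snippet (the position values are placeholders for wherever your game object actually is):

// Sketch: feed a world position into an event instance so the Profiler's
// 3D view (and distance/panning behaviour) has positional data to record.
FMOD_3D_ATTRIBUTES attributes = {};
attributes.position = { 2.0f, 0.0f, 5.0f };   // placeholder world position
attributes.forward  = { 0.0f, 0.0f, 1.0f };
attributes.up       = { 0.0f, 1.0f, 0.0f };
instance->set3DAttributes(&attributes);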

(5:19) Now if I select that particular instance in the 3D previewer, we can see the parameters associated with that Event in the Deck area. For this Event, that is the Distance parameter information. As we can see, not only does it show us the current 3D position during that particular part of the playback, it also shows us the distance value that was associated with it.

Selecting the Instance will show further details and Parameter information in the Deck area.
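
Built-in parameters such as Distance are driven automatically by the 3D attributes above, while game-controlled parameters are set explicitly and are recorded in the same way. A sketch using the 1.x API call (“Intensity” is a hypothetical parameter name, not one from the tutorial project):

// Sketch: set a user parameter on an event instance (1.x API).
instance->setParameterValue("Intensity", 0.75f);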

(6:20) Now we can load up the parameter information as a separate graph just by right-clicking on the dial and selecting the parameter graph option.

Right click on the Parameter Dials to expose them as a Graph.

 

(6:36) That is the crash course in the Profiler UI. We have gone through the basics of setting up the Profiler to record Events in the FMOD Studio environment and explored what each of the data graphs means. We even looked at some advanced functions of the Profiler, like viewing the positioning of an Event and displaying each of the parameters associated with that sound.

(7:11) Please join us in the next tutorial, where we will be hooking the Profiler up to Unreal Engine 4 and looking at the API functionality.

(7:21) All of us at FMOD would love to thank you for joining us and we all hope to see you again soon.

Credits and Attribution

Assets in the asset pack are provided by the Sound Librarian, Soundwave Concepts, Mixamo, Epic Games, and Sally Kellaway herself at Firelight Technologies. Please refer to the Read Me document for further information on licensing, attribution and commercial distribution.