One way to make music respond quickly to on-screen action is to keep the base melody and the more intense drumming on separate tracks, and fade the drumming layer in and out in accordance with the action. This keeps each music segment manageably long while still giving a fast response time.

I figure you could do this by playing the drumming as a flourish – but how do you adjust the flourish’s volume during playback?

Also, is it possible to apply some other realtime FX, such as filters, to the music segments?
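
To make the layering idea concrete, here is roughly what I mean, sketched with the low-level FMOD Ex API rather than Designer (file names are made up, and both layers are assumed to be the same length and tempo):

[code]
#include <fmod.hpp>

int main()
{
    FMOD::System *sys;
    FMOD::System_Create(&sys);
    sys->init(32, FMOD_INIT_NORMAL, 0);

    // Base melody and intense drumming authored as separate, equal-length loops.
    FMOD::Sound *base, *drums;
    sys->createStream("music_base.ogg",  FMOD_LOOP_NORMAL, 0, &base);
    sys->createStream("music_drums.ogg", FMOD_LOOP_NORMAL, 0, &drums);

    // Start both paused, then unpause together so the layers begin aligned.
    FMOD::Channel *baseCh, *drumCh;
    sys->playSound(FMOD_CHANNEL_FREE, base,  true, &baseCh);
    sys->playSound(FMOD_CHANNEL_FREE, drums, true, &drumCh);
    drumCh->setVolume(0.0f);              // drum layer starts silent
    baseCh->setPaused(false);
    drumCh->setPaused(false);

    // Game loop: ease the drum layer's volume toward the action intensity.
    for (;;)
    {
        float intensity = 0.5f;           // 0..1, from gameplay
        float vol;
        drumCh->getVolume(&vol);
        drumCh->setVolume(vol + (intensity - vol) * 0.05f);
        sys->update();
        // ... wait for next frame ...
    }
}
[/code]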

Hi Peter,

Unfortunately, at this point Designer handles neither segment-level envelopes nor routing of segments to separate categories (a category being roughly equivalent to a mix channel).
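
For comparison, events can already be routed to categories and mixed at that level – roughly like this (a sketch; the project and category names are hypothetical):

[code]
#include <fmod_event.hpp>

int main()
{
    FMOD::EventSystem *es;
    FMOD::EventSystem_Create(&es);
    es->init(64, FMOD_INIT_NORMAL, 0, FMOD_EVENT_INIT_NORMAL);
    es->load("project.fev", 0, 0);

    // Every event routed to the "music" category is mixed through it,
    // so one call fades all of those events together.
    FMOD::EventCategory *music;
    es->getCategory("music", &music);
    music->setVolume(0.5f);
}
[/code]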

Dynamic mixing is on our roadmap for interactive music, although it is not currently scheduled for the near future.

Thanks for your feedback.

Thanks for your response.

FMOD Designer Events actually do support this to an extent: you can add several layers to an event and apply envelopes to those layers, and overall Events have some similarities to Interactive Music segments (such as queued playback). I toy with the thought of being able to use FMOD Events as segments in interactive music, with the Interactive Music scene editor serving to link the Events logically (and tag them with BPM metadata), and with the Events’ parameters accessible and adjustable from the Interactive Music side as well.

I may actually be able to achieve what I’m trying to do with the FMOD Designer Event Editor alone. It won’t allow logical branching the way Interactive Music does, but my music only has a linear "intensity" progression (so it needs just one parameter), and the intro and outro can be triggered via scripting. I’m going to give it a shot. :)
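
On the code side I picture it roughly like this (the project and event names are made up; "intensity" is the single parameter, with the layer envelopes authored in Designer):

[code]
#include <fmod_event.hpp>

int main()
{
    FMOD::EventSystem *es;
    FMOD::EventSystem_Create(&es);
    es->init(64, FMOD_INIT_NORMAL, 0, FMOD_EVENT_INIT_NORMAL);
    es->load("music.fev", 0, 0);

    // One multi-layer music event; the envelopes authored in Designer
    // fade layers in and out as the "intensity" parameter moves.
    FMOD::Event *musicEvent;
    es->getEvent("music/main_loop", FMOD_EVENT_DEFAULT, &musicEvent);
    musicEvent->start();

    FMOD::EventParameter *intensity;
    musicEvent->getParameter("intensity", &intensity);

    // Per frame: push the current gameplay intensity into the event.
    intensity->setValue(0.75f);
    es->update();
}
[/code]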

Unless new changes have been introduced, events cannot seamlessly stitch from one sound definition instance to another – there is an audible gap.

It is probably my single biggest complaint about Designer, since the music API and the low-level code support are there to do gapless stitching and sentencing.

Something about the Event API in particular must be creating the gap… :(

[quote="Symbiotic":2fdwboqe]Unless new changes have been introduced, events cannot seamlessly stitch from one sound definition instance to another – there is an audible gap.

It is probably my single biggest complaint about Designer, since the music API and the low-level code support is there to do gapless stitching and sentencing.

Something about the Event API in particular must be creating the gap… :([/quote:2fdwboqe]

The low-level code support for stitching has only recently been added. It was added specifically to support the music system, and the event system has not yet been upgraded to take advantage of it.

We do have plans to add seamless stitching to the event system. Look out for it in a future release.
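
For reference, stitching at the low level takes the form of subsound sentencing – roughly along these lines (a sketch; file names are hypothetical):

[code]
#include <fmod.hpp>
#include <string.h>

int main()
{
    FMOD::System *sys;
    FMOD::System_Create(&sys);
    sys->init(32, FMOD_INIT_NORMAL, 0);

    // Parent stream with two subsound slots that will be stitched gaplessly.
    FMOD_CREATESOUNDEXINFO exinfo;
    memset(&exinfo, 0, sizeof(exinfo));
    exinfo.cbsize           = sizeof(exinfo);
    exinfo.numsubsounds     = 2;
    exinfo.defaultfrequency = 44100;
    exinfo.numchannels      = 2;
    exinfo.format           = FMOD_SOUND_FORMAT_PCM16;

    FMOD::Sound *parent;
    sys->createStream(0, FMOD_OPENUSER | FMOD_LOOP_NORMAL, &exinfo, &parent);

    FMOD::Sound *a, *b;
    sys->createSound("segment_a.wav", FMOD_DEFAULT, 0, &a);
    sys->createSound("segment_b.wav", FMOD_DEFAULT, 0, &b);
    parent->setSubSound(0, a);
    parent->setSubSound(1, b);

    // Define the playback order; the stream plays A then B with no gap.
    int order[2] = { 0, 1 };
    parent->setSubSoundSentence(order, 2);

    FMOD::Channel *ch;
    sys->playSound(FMOD_CHANNEL_FREE, parent, false, &ch);
}
[/code]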

Thanks,
Ben
