Audio Fading and Crossfading with Positional Emitters & lerp()


Attached Files

The following files have been attached to this tutorial:

  • lerp-fade-xfade.capx (852.51 KB)
  • xfade-emitters.capx (1.3 MB)


License

This tutorial is licensed under CC BY 4.0. Please refer to the license text if you wish to reuse, share or remix the content contained within this tutorial.

Published on 14 Jul, 2017. Last updated 19 Feb, 2019

On both technical and artistic fronts, game audio composers and sound designers strive to create a seamless sonic experience for players. When a player makes a choice that affects the game world, they expect to see and hear the consequences of that choice. When this kind of effect is done well, players don't notice it. They carry on with their mission in blissful ignorance of the technical challenges solved by designers and developers, whose careful work makes the game look and sound right. When those efforts fall short, however, the experience is no longer seamless, and players can feel as though they have been "pulled out" of the game world. Technical and aesthetic oversights break the continuity of experience. As designers and producers of games, we want to prevent these kinds of disruptions: hiccups and glitches can turn a brilliant, well-conceived game into a frustrating experience.

When it comes to preserving continuity of experience in game audio, one of the most useful techniques is the crossfade. If you're not familiar with the term by name, you have surely heard the effect many times before. When music or a recorded sound environment plays, then gradually fades away as it is simultaneously replaced by a different music track or environment, that is a crossfade. The name reveals how the effect is accomplished on a technical level: the volume level of one element is reduced (fades out) while another gradually increases (fades in), the two levels crossing each other like boats passing in a canal. When the effect is done well, the sonic experience is seamless. Players hear that something has changed and see that change reflected in the form of a new environment or other condition in their game world. This coherence preserves an overall continuity and helps to keep players invested.

Construct does not offer any default Actions to crossfade audio, but it is possible to use other means to create the effect. This lesson shows two possibilities: one uses Construct's positional sound capabilities, while the other uses the lerp() (linear interpolation) function. Both work, and each has its own strengths and weaknesses; ultimately, the unique demands of your game will determine which best meets your needs.
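Before digging into the first approach, here is a minimal sketch of what lerp() does, written in TypeScript rather than as Construct Events. The function and the 0 dB / -60 dB endpoints are illustrative assumptions for this tutorial, not code taken from the attached .capx files.

```ts
// lerp(a, b, t) returns a when t = 0, b when t = 1, and points in between otherwise.
function lerp(a: number, b: number, t: number): number {
  return a + (b - a) * t;
}

// Illustrative crossfade: as elapsed time runs from 0 to duration, one
// track's level ramps from 0 dB down to -60 dB while the other ramps
// from -60 dB up to 0 dB.
function crossfadeLevels(elapsed: number, duration: number) {
  const t = Math.min(elapsed / duration, 1);
  return {
    fadeOutDb: lerp(0, -60, t), // outgoing element fades out
    fadeInDb: lerp(-60, 0, t),  // incoming element fades in
  };
}
```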

Positional Audio Crossfade

The Event Sheet for a positional audio crossfade looks like this:

You can also download the xfade-emitters.capx file from the Attached Files list above.

In order to understand how this works it’s helpful to be familiar with positional audio in Construct. As Ashley Gullen (one of Scirra’s founders) notes,

"Think of a directional sound as a cone: relative to the direction the object is pointing in, the inner angle is all max volume, the outer angle is at minimum volume, and the bit in between smoothly transitions from one to the other." (source)

If the cone metaphor doesn’t quite click yet, we will put the idea of "sound cones" in a more specific context a little later.

If a technical or theoretical explanation isn’t sufficient, use your ears! Put on headphones and spend some time with the Audio Positioning example that ships with Construct. This demo file uses several different techniques to place sounds in a game world. Listen as you play through these examples (again, with headphones—all laptop and most desktop speakers can’t do this justice) and examine the Event sheet to solidify your understanding of how what you see and do in this example affects what you hear.

At the top of the Event sheet we have four Global variables: outerAngle, innerAngle, activeStateVol, and inactiveStateVol. While variables aren’t required, putting these values into variables means that when you fine-tune your audio you only need to update them in one place, and the change is applied to every Event or Action that uses them. It’s a best practice that saves time and guards against mistakes.

activeStateVol and inactiveStateVol set the upper (0 dB) and lower (-60 dB) volume levels for a positioned sound, while outerAngle and innerAngle set two angles of 200 and 160 degrees respectively.
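For readers who prefer code to Event sheets, the same values can be sketched in TypeScript. Construct's audio engine is built on the Web Audio API, which works in linear gain rather than decibels, so a small conversion helper is included; the names simply mirror the tutorial's Global variables.

```ts
// Mirrors the tutorial's Global variables (names match the Event sheet).
const outerAngle = 200;       // degrees: outer edge of the cone, minimum volume
const innerAngle = 160;       // degrees: inner cone, full volume
const activeStateVol = 0;     // dB: full volume for the active emitter
const inactiveStateVol = -60; // dB: effectively silent

// Web Audio expects linear gain, so convert decibels: 0 dB -> 1.0, -60 dB -> ~0.001.
const dbToGain = (db: number): number => Math.pow(10, db / 20);
```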

A lone System event sets up the rest:

The first Audio action uses Set listener object to define player as a point of audition. A Listener allows audio sources in the game to be heard from its position or perspective. In this case, sound-emitting objects near player will be louder than those at a distance.
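In Web Audio terms (a sketch, not Construct's internal code), Set listener object amounts to keeping the AudioContext's listener at the player's coordinates. The player object here is a hypothetical stand-in with x/y properties, and the positionX/Y/Z AudioParams are the modern API; older browsers expose a deprecated setPosition() instead.

```ts
const ctx = new AudioContext();

// Keep the point of audition glued to the player; call this every tick
// (or whenever the player moves).
function updateListener(player: { x: number; y: number }) {
  ctx.listener.positionX.value = player.x;
  ctx.listener.positionY.value = player.y;
  ctx.listener.positionZ.value = 0; // 2D game: everything sits on the same plane
}
```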

The second Audio action uses Play at object to turn the hills object into an audio emitter: sound plays from this object as if a speaker were attached to it. Double-click the Action to see that it plays a file called CM-seamless in looping mode. The volume level is defined by activeStateVol, while innerAngle, outerAngle, and inactiveStateVol supply the remaining parameters of this Action.

The third Audio action does the same thing, but for a different object called sky that plays the sound Eb-seamless.
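Continuing the sketch above, both emitters can be expressed with Web Audio PannerNodes, whose cone properties map directly onto the tutorial's variables. The buffers, positions, and facing angles below are placeholders (the real project positions the hidden hills and sky objects in the Layout), so treat this as an assumption-laden illustration rather than what Construct generates internally.

```ts
// Stand-ins for the decoded CM-seamless and Eb-seamless loops.
declare const hillsBuffer: AudioBuffer;
declare const skyBuffer: AudioBuffer;

// Rough equivalent of "Play at object": a looping source played through a
// directional panner at the object's position, facing a given angle (radians).
function playAtObject(buffer: AudioBuffer, x: number, y: number, facing: number) {
  const source = ctx.createBufferSource();
  source.buffer = buffer;
  source.loop = true;

  const panner = ctx.createPanner();
  panner.coneInnerAngle = innerAngle;                // 160°: full volume inside
  panner.coneOuterAngle = outerAngle;                // 200°: edge of the fade region
  panner.coneOuterGain = dbToGain(inactiveStateVol); // -60 dB outside the cone
  panner.positionX.value = x;
  panner.positionY.value = y;
  panner.orientationX.value = Math.cos(facing);      // direction the cone points
  panner.orientationY.value = Math.sin(facing);

  const gain = ctx.createGain();
  gain.gain.value = dbToGain(activeStateVol);        // 0 dB overall level

  source.connect(panner);
  panner.connect(gain);
  gain.connect(ctx.destination);
  source.start();
}

// Two emitters facing opposite directions; coordinates are placeholders only.
playAtObject(hillsBuffer, 0, -200, Math.PI / 2);
playAtObject(skyBuffer, 0, 200, -Math.PI / 2);
```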

Take a closer look at the Layout tab for this project and you can see that the hills and sky objects are on their own layer (called targets), hidden beneath the background layer. This arrangement allows these objects to serve their technical purpose without having any visual representation in the game world. The idea behind this example was to move the player character across the mountain range and hear a musical shift as the character climbs from the hills towards the sky and back down again.

Given Ashley’s description of a sound "cone," and the elements of this Event sheet, Construct has created something like this:

Speaker icon illustrated by Silviu Runceanu, CC BY 3.0

In the game, your character (or whatever represents you in the game world) moves around the object at the center, which plays (emits) a sound from its location. The yellow wedge, the 160° innerAngle, shows the area in which that sound is heard at full volume, the 0 dB specified by activeStateVol. The larger 200° blue line is set by outerAngle; at its edge, the sound level drops to -60 dB (inactiveStateVol). As player moves across the green region between the bounds of the yellow wedge and the blue line, the sound of that emitter transitions from 0 dB to -60 dB (or, moving in the opposite direction, from -60 dB back to 0 dB).
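Continuing the TypeScript sketch, the Web Audio cone model behind this behavior can be written out explicitly: full gain inside the inner half-angle, the outer gain beyond the outer half-angle, and a linear ramp in between. This is an illustration of the model, not code pulled from Construct.

```ts
// Gain applied to an emitter based on the angle (in degrees) between the
// emitter's facing direction and the line from the emitter to the listener.
function coneGain(angleFromFacing: number): number {
  const absAngle = Math.abs(angleFromFacing);
  const innerHalf = innerAngle / 2;             // 80°: inside the yellow wedge
  const outerHalf = outerAngle / 2;             // 100°: edge of the blue line
  const outerGain = dbToGain(inactiveStateVol); // -60 dB -> ~0.001

  if (absAngle <= innerHalf) return 1;          // full volume (0 dB)
  if (absAngle >= outerHalf) return outerGain;  // minimum volume (-60 dB)

  // Green region: linearly interpolate the gain from 1 down to outerGain.
  const t = (absAngle - innerHalf) / (outerHalf - innerHalf);
  return 1 + t * (outerGain - 1);
}
```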

In the example discussed here, there are two emitter objects facing in opposite directions. Leaving the 160° and 200° wedges of one means simultaneously entering the wedges of the other, which creates the crossfade effect. The specific application of the technique looks like this:

This technique was developed to meet the specific challenge of creating smooth sound transitions as players move from place to place in a game world. I’ve had a few students use this technique, and it worked well for them, though I don’t know how far they had to go to wrangle this basic approach into their specific games. Hopefully it will provide a useful starting point in your own designs and allow sounds to smoothly and seamlessly crossfade to match the interactions of your game.
