AudioLink was introduced in UE 5.1 in late 2022 and aims to function a bit like Propellerhead's ReWire (Anderton, 2019), allowing audio to be routed between Metasounds and any app that supports the protocol. Wwise currently has the most documented integration for this, although that statement should be taken with some latitude as there's not a huge amount of information on setup and functionality. So far the best guide I've found is a post titled "Adventures With AudioLink" on the Audiokinetic blog (Audiokinetic Inc, 2023a). FMOD also supports the protocol and, as far as I can tell, has a similar process (and limitations) to that discussed in this post; however, I've not yet explored the FMOD/AudioLink combination practically (FMOD – AudioLink, 2024).
Using AudioLink at the moment is very much a question of exploration and iteration, but it promises a very exciting way of combining the procedural possibilities of Metasounds with the vast implementation functionality of middleware like Wwise. While Metasounds does offer sample playback, in this example I've focused on the synthesis aspects and used Wwise for wave playback.

Word of warning
When using AudioLink I encountered several random loud bursts of noise at ear-bleeding amplitude. I've not been able to work out what's causing this, but I used the precautions below when working. Note these are only recommendations based on my experience and there may well be a more robust technical approach.
- Always work at low playback levels, especially after adding new AudioLink objects to the project and on the first play of a new work session.
- Restart UE and Wwise after creating a new AudioLink object.
- Watch out for sudden bursts of noise after adding reverb zones (a restart is also advisable).
- Add a limiter on the master output in the Wwise project.
Setting up AudioLink with Wwise
This process is also covered in the Wwise documentation under "Combining Unreal 5 and Wwise Audio with AudioLink" (Audiokinetic Inc, 2023b).
Most of this process happens in Unreal Engine, and Wwise simply uses the "Audio Input" object to source the sound through its signal flow. As this is a software protocol, the functionality first needs to be enabled in the Wwise integration section of the UE Project Settings window by selecting "Route through AudioLink [5.1+]" under "Audio Routings".
The plugin uses either the Submix or the Attenuation overrides to pass the audio to the destination (Wwise), but I've focused on the latter for this project as I also wanted to explore occlusion and obstruction; more on this later.
The signal path in UE5 as used in this project is shown in Figure 1. Each step can be added via the UE editor, concluding with selecting the Wwise event in the last AudioLink send. For my project, I found that setting the "Producer to Consumer Buffer Ratio" slightly higher reduced glitches, but this is likely to vary from system to system.
UE settings and spatialisation
In the attenuation override, the only setting that needs to be engaged for the sound to be sent to Wwise is the AudioLink override in the Attenuation settings of the Metasounds patch (see Figure 2). But you're likely to also want to check "Attenuation Spatialization", as this allows the option to be used, or not, from the Wwise project (see Figure 3).
Trying out the different options for 3D sound, I did not find that the UE attenuation settings had any particular effect, but checking the box allowed spatialisation and attenuation settings to be configured in Wwise. At this point, I'm unclear if and how the UE settings affect 3D positioning beyond enabling this to be set and adjusted from the target app.
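For anyone who prefers setting this up in code rather than in the editor, below is a minimal sketch of the same attenuation override applied to an audio component in C++. It assumes UE 5.1+, and the AudioLink-related property names (bEnableSendToAudioLink, AudioLinkSettingsOverride) reflect my reading of FSoundAttenuationSettings; they may differ between engine versions, so treat this as illustrative rather than definitive.

```cpp
// Sketch only: enable spatialisation and the AudioLink send on a component's
// attenuation override. The AudioLink property names are assumptions based on
// the UE 5.1 attenuation settings panel and may vary by engine version.

#include "Components/AudioComponent.h"
#include "Sound/SoundAttenuation.h"

void ConfigureAudioLinkAttenuation(UAudioComponent* AudioComp)
{
    if (!AudioComp)
    {
        return;
    }

    FSoundAttenuationSettings Settings;
    Settings.bSpatialize = true;                  // "Attenuation Spatialization" checkbox
    Settings.bEnableSendToAudioLink = true;       // the AudioLink override discussed above
    Settings.AudioLinkSettingsOverride = nullptr; // nullptr = use the project-wide AudioLink settings

    // Apply as a per-component override rather than a shared USoundAttenuation asset.
    AudioComp->bOverrideAttenuation = true;
    AudioComp->AttenuationOverrides = Settings;
}
```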
Obstruction/occlusion
While UE obstructs audio and allows for virtualisation, I could not find a way to send the silent but still-playing signal to Wwise, so the sound is simply cut off once you pass behind a wall. UE allows a fade-out time to be set, so it's quite easy to create a natural feel to the dissipating sound, but despite trying many different options I was not able to use Wwise's obstruction system with the Audio Input object.
As there's seemingly no way to manipulate this outside of Unreal's spatialisation system, it's currently not possible to build an automatic occlusion/obstruction audio system exclusively in Wwise (or the target app) when using the AudioLink protocol. However, using RTPCs it's relatively easy to set up level/EQ automation that allows the sound to adapt to the environment the player character is moving through, as can be seen in the sketch below and in the video at the top of the page.
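As a rough illustration of that workaround, here's a sketch of the Unreal side: a line trace between the listener and the emitter drives an RTPC, and a level/EQ curve mapped to that RTPC in the Wwise project does the actual filtering. The RTPC name "OcclusionAmount" is a placeholder, and the FAkAudioDevice::SetRTPCValue call reflects my understanding of the Wwise UE integration, so check the exact signature against your SDK version.

```cpp
// Sketch only, assuming the Wwise UE integration: a single line trace from the
// listener (player camera) to the emitter drives an "OcclusionAmount" RTPC
// (a placeholder name) that a volume/EQ curve in the Wwise project responds to.

#include "AkAudioDevice.h"
#include "Engine/World.h"
#include "GameFramework/PlayerController.h"
#include "Kismet/GameplayStatics.h"

void UpdateOcclusionRTPC(UWorld* World, AActor* EmitterActor)
{
    if (!World || !EmitterActor)
    {
        return;
    }

    // Use the player camera as a stand-in for the listener position.
    FVector ListenerLocation = FVector::ZeroVector;
    FRotator ListenerRotation = FRotator::ZeroRotator;
    if (APlayerController* PC = UGameplayStatics::GetPlayerController(World, 0))
    {
        PC->GetPlayerViewPoint(ListenerLocation, ListenerRotation);
    }

    // Blocked line of sight = treat as fully occluded; clear = not occluded.
    FHitResult Hit;
    FCollisionQueryParams Params;
    Params.AddIgnoredActor(EmitterActor);
    const bool bBlocked = World->LineTraceSingleByChannel(
        Hit, ListenerLocation, EmitterActor->GetActorLocation(), ECC_Visibility, Params);

    const float OcclusionValue = bBlocked ? 100.f : 0.f;

    // Hand the value to Wwise; the RTPC curve on the Audio Input object's bus
    // does the actual level/EQ change. The signature may differ between versions.
    if (FAkAudioDevice* AkDevice = FAkAudioDevice::Get())
    {
        AkDevice->SetRTPCValue(TEXT("OcclusionAmount"), OcclusionValue,
                               /*InterpolationTimeMs=*/200, EmitterActor);
    }
}
```

In practice this would be called from the emitter's Tick (or a timer), with the interpolation time smoothing the transition so the level/EQ change doesn't jump as the player crosses a doorway.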
Metasounds procedural possibilities
Maybe the most exciting aspect of Metasounds is the ability to create adaptive audio that uses synthesis methods to sculpt sound for various purposes in a game and other environments. In this example, the player is guided to the correct path by manipulating an ADSR and other processes in the Metasound patch.
Also demonstrated in the video is a drone bass patch that dips in pitch, accentuated by a filter sweep.
This can of course be achieved by concatenating sound files in the middleware, but the ability to manipulate the sound directly using methods like subtractive and FM synthesis opens up possibilities that extend beyond sample playback. As an example, as the player approaches the control panel, the dimming light's lerp values are used to play through several oscillators, each with a rising pitch. The Metasound patch for this is demonstrated below, as is a drawback: it's not possible to audition the patch outside of gameplay, so adjustments to the sound design mean continuous level balancing, as engaging the submix means the sound is played back via two systems.
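On the Unreal side, feeding the lerp value into the patch can be as simple as setting a named MetaSound input from the blueprint or C++ code that drives the light. A minimal sketch, assuming the patch exposes a float input called "LightLerp" (a name made up here for illustration):

```cpp
// Sketch: push the control panel light's lerp alpha into a MetaSound input.
// "LightLerp" is a made-up input name; it must match an input exposed by the patch.

#include "Components/AudioComponent.h"

void PushLightValueToMetaSound(UAudioComponent* MetaSoundComp, float LightLerpAlpha)
{
    if (!MetaSoundComp || !MetaSoundComp->IsPlaying())
    {
        return;
    }

    // SetFloatParameter forwards the value to the MetaSound input of the same
    // name; in the patch this scales the oscillator pitches as the light dims.
    MetaSoundComp->SetFloatParameter(FName(TEXT("LightLerp")), LightLerpAlpha);
}
```

The same parameter interface also handles triggers, so discrete events (such as the ADSR retrigger mentioned above) can be sent to the patch in a similar way.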
Summary
Unreal Engine 5.1's AudioLink presents an innovative approach to audio routing between Metasounds and compatible applications, offering exciting possibilities for combining procedural audio with the functionality of middleware like Wwise. However, its current usage is somewhat hampered by occasional sudden bursts of noise. Recommendations include working at low playback levels, restarting applications after creating new AudioLink components, and adding a limiter to the master output in Wwise.
Setting up AudioLink with Wwise involves enabling the functionality in the Unreal Engine settings and configuring the signal path. It's also unclear which parts of UE's spatialisation settings affect 3D behaviour in the middleware, and how obstruction/occlusion can be managed through the integration.
Despite these challenges, Metasounds offer exciting prospects for adaptive audio through procedural synthesis methods. Examples include guiding players with manipulated ADSR parameters and creating dynamic effects like filter sweeps. However, adjusting sound design requires continuous balancing due to limitations in auditioning Metasound patches outside of gameplay.
References
Anderton, C. (2019) Upgrade Your DAW with ReWire, inSync. Available at: https://www.sweetwater.com/insync/upgrade-your-daw-with-rewire/ (Accessed: 19 March 2024).
Audiokinetic Inc (2023a) Adventures With AudioLink, Audiokinetic Blog. Available at: https://blog.audiokinetic.com/en/adventures-with-audiolink/ (Accessed: 19 March 2024).
Audiokinetic Inc (2023b) Combining Unreal 5 and Wwise Audio with AudioLink. Available at: https://www.audiokinetic.com/en/library/edge/?source=UE4&id=using_audio_link.html (Accessed: 19 March 2024).
FMOD – AudioLink (2024). Available at: https://fmod.com/docs/2.02/unreal/audiolink.html (Accessed: 19 March 2024).