rust edited this page Mar 28, 2024 · 7 revisions

Audio is added to 3D applications either as background music or as UI feedback. If it makes sense for something in a 3D app to cause a noise, then a noise should happen.

3D audio is the ability to play a sound at a virtual location. A single listener, representing the user, is placed at a virtual location and hears more or less of a sound depending on the distance between the listener and where the sound occurred.
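The distance falloff described above is typically computed with an inverse-distance model, which is also OpenAL's default attenuation model. The sketch below illustrates the idea; the function name and parameters are illustrative, not part of the vu engine API.

```go
package main

import "fmt"

// inverseDistanceGain returns the gain applied to a sound given the
// distance from the listener to the source. refDist is the distance at
// which the sound plays at full gain; rolloff controls how quickly the
// gain falls off beyond that. This mirrors OpenAL's inverse distance
// clamped model in spirit, but is only an illustration.
func inverseDistanceGain(dist, refDist, rolloff float64) float64 {
	if dist < refDist {
		dist = refDist // clamp: no amplification closer than refDist.
	}
	return refDist / (refDist + rolloff*(dist-refDist))
}

func main() {
	// Full gain at the reference distance, fading as the source recedes.
	fmt.Printf("%.2f\n", inverseDistanceGain(1, 1, 1)) // 1.00
	fmt.Printf("%.2f\n", inverseDistanceGain(5, 1, 1)) // 0.20
}
```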

Audio is recorded and stored in a digital format. The vu engine can load the WAV format. The engine is responsible for loading audio files and presenting them to the audio device hardware.

Sounds are added directly to the engine using Engine.AddSound. The 3D application is responsible for loading the sound asset and telling the engine to play the sound on an entity that has a position and orientation in virtual space.

    var collideSound uint32
    ...
    eng.ImportAssets("collide.wav")        // load asset from disk.
    collideSound = eng.AddSound("collide") // add the collide sound asset.
    ...
    player.PlaySound(collideSound)

See Entity.PlaySound.

The 3D application must also place the sound listener at a specific location by assigning the listener to a specific entity.

    player.SetListener()

See Entity.SetListener.

Sounds are played on device specific audio hardware. Specifically, sounds are sent to a device layer API which then does the hard work of playing the sound by interfacing with the audio hardware. The vu engine uses the OpenAL audio API. Audio device level code is the responsibility of the vu/audio package.
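Isolating the device layer behind an interface is what keeps the OpenAL calls contained in one package. The sketch below shows one way such a seam could look, with a no-op implementation useful for tests or platforms without audio hardware. The interface and method names here are illustrative assumptions, not the actual vu/audio API.

```go
package main

import "fmt"

// Audio sketches a device-layer seam the engine could code against so that
// all OpenAL-specific calls stay isolated in a single package. Illustrative
// only; the real vu/audio package defines its own interface.
type Audio interface {
	LoadSound(name string, data []byte) (id uint64, err error)
	PlaySound(id uint64, x, y, z float64)
	PlaceListener(x, y, z float64)
}

// stubAudio is a no-op Audio implementation that only tracks sound ids.
type stubAudio struct{ sounds map[string]uint64 }

func (s *stubAudio) LoadSound(name string, data []byte) (uint64, error) {
	id := uint64(len(s.sounds) + 1)
	s.sounds[name] = id
	return id, nil
}
func (s *stubAudio) PlaySound(id uint64, x, y, z float64) {} // no-op.
func (s *stubAudio) PlaceListener(x, y, z float64)        {} // no-op.

func main() {
	var dev Audio = &stubAudio{sounds: map[string]uint64{}}
	id, _ := dev.LoadSound("collide", nil)
	dev.PlaceListener(0, 0, 0)
	dev.PlaySound(id, 1, 0, 0)
	fmt.Println("loaded sound id:", id)
}
```

Swapping stubAudio for an OpenAL-backed implementation changes nothing in the calling code, which is the point of keeping device level code in one package.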
